Jan 30 18:30:22 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 30 18:30:22 crc restorecon[4691]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 18:30:22
crc restorecon[4691]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 
18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc 
restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 
crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 
crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:22 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 
18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 18:30:23 crc 
restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 
18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 
18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc 
restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 18:30:23 crc restorecon[4691]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 30 18:30:24 crc kubenswrapper[4782]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 18:30:24 crc kubenswrapper[4782]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 30 18:30:24 crc kubenswrapper[4782]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 18:30:24 crc kubenswrapper[4782]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
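The deprecation warnings above, and the two that follow, all point at the same fix: move these flags into the kubelet config file passed via --config. Below is a minimal sketch of what that file could look like, assuming a kubelet recent enough to accept containerRuntimeEndpoint in KubeletConfiguration; the socket path, plugin directory, taint, eviction threshold, and reservations are illustrative placeholders, not values read from this node:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (assumed CRI-O socket)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (directory shown is illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --register-with-taints (example taint only)
registerWithTaints:
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
# replaces --system-reserved (placeholder reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi

On an OpenShift/CRC node this file is normally rendered by the machine config operator rather than written by hand, so the sketch only illustrates the mapping the warnings describe.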
Jan 30 18:30:24 crc kubenswrapper[4782]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 30 18:30:24 crc kubenswrapper[4782]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.150996 4782 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154796 4782 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154821 4782 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154829 4782 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154836 4782 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154842 4782 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154850 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154856 4782 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154864 4782 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154870 4782 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154877 4782 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154884 4782 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154889 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154894 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154905 4782 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154910 4782 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154916 4782 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154922 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154927 4782 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154931 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154936 4782 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154941 4782 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154947 4782 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154952 4782 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154959 4782 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154964 4782 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154968 4782 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154974 4782 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154979 4782 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154984 4782 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154989 4782 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154994 4782 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.154999 4782 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155004 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155009 4782 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155014 4782 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155020 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155034 4782 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155040 4782 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155047 4782 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155052 4782 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155058 4782 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155064 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155069 4782 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155074 4782 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155079 4782 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155084 4782 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155089 4782 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155093 4782 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155098 4782 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155103 4782 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155108 4782 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155113 4782 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155119 4782 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155125 4782 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155132 4782 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155138 4782 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155143 4782 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155148 4782 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155153 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155157 4782 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155162 4782 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155167 4782 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155172 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155176 4782 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155181 4782 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155185 4782 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155197 4782 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155202 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155209 4782 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
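The long runs of feature_gate.go:330 "unrecognized feature gate" warnings above name gates (MachineConfigNodes, GatewayAPI, NewOLM, and so on) that this kubelet build does not register, apparently higher-level gates passed along with the node configuration; the kubelet only warns and moves on. A minimal sketch, again assuming a saved dump named kubelet.log (illustrative), tallies how often each unknown gate is reported:

    #!/usr/bin/env python3
    # Minimal sketch: tally the "unrecognized feature gate" warnings from a saved
    # journal dump ("kubelet.log" is an illustrative name).
    import re
    from collections import Counter

    PATTERN = re.compile(r"unrecognized feature gate: (\S+)")

    def unrecognized_gates(path="kubelet.log"):
        counts = Counter()
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                counts.update(PATTERN.findall(line))
        return counts

    if __name__ == "__main__":
        for gate, n in sorted(unrecognized_gates().items()):
            print(f"{n:3d}  {gate}")

A count greater than one per gate is expected here, since identical warning blocks recur later in this same startup as the gate configuration is parsed again.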
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155215 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.155221 4782 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156323 4782 flags.go:64] FLAG: --address="0.0.0.0" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156341 4782 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156362 4782 flags.go:64] FLAG: --anonymous-auth="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156370 4782 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156378 4782 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156385 4782 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156394 4782 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156401 4782 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156407 4782 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156412 4782 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156419 4782 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156424 4782 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156430 4782 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156436 4782 flags.go:64] FLAG: --cgroup-root="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156441 4782 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156447 4782 flags.go:64] FLAG: --client-ca-file="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156453 4782 flags.go:64] FLAG: --cloud-config="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156459 4782 flags.go:64] FLAG: --cloud-provider="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156464 4782 flags.go:64] FLAG: --cluster-dns="[]" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156474 4782 flags.go:64] FLAG: --cluster-domain="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156480 4782 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156486 4782 flags.go:64] FLAG: --config-dir="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156491 4782 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156498 4782 flags.go:64] FLAG: --container-log-max-files="5" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156505 4782 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156511 4782 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156517 4782 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 30 
18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156522 4782 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156528 4782 flags.go:64] FLAG: --contention-profiling="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156534 4782 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156540 4782 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156561 4782 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156567 4782 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156575 4782 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156582 4782 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156588 4782 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156594 4782 flags.go:64] FLAG: --enable-load-reader="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156599 4782 flags.go:64] FLAG: --enable-server="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156605 4782 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156616 4782 flags.go:64] FLAG: --event-burst="100" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156622 4782 flags.go:64] FLAG: --event-qps="50" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156628 4782 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156633 4782 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156639 4782 flags.go:64] FLAG: --eviction-hard="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156646 4782 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156652 4782 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156657 4782 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156663 4782 flags.go:64] FLAG: --eviction-soft="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156669 4782 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156675 4782 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156681 4782 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156687 4782 flags.go:64] FLAG: --experimental-mounter-path="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156692 4782 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156698 4782 flags.go:64] FLAG: --fail-swap-on="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156703 4782 flags.go:64] FLAG: --feature-gates="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156710 4782 flags.go:64] FLAG: --file-check-frequency="20s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156716 4782 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 30 18:30:24 crc 
kubenswrapper[4782]: I0130 18:30:24.156721 4782 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156727 4782 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156733 4782 flags.go:64] FLAG: --healthz-port="10248" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156739 4782 flags.go:64] FLAG: --help="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156744 4782 flags.go:64] FLAG: --hostname-override="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156749 4782 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156755 4782 flags.go:64] FLAG: --http-check-frequency="20s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156760 4782 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156766 4782 flags.go:64] FLAG: --image-credential-provider-config="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156772 4782 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156785 4782 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156791 4782 flags.go:64] FLAG: --image-service-endpoint="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156796 4782 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156802 4782 flags.go:64] FLAG: --kube-api-burst="100" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156808 4782 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156836 4782 flags.go:64] FLAG: --kube-api-qps="50" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156843 4782 flags.go:64] FLAG: --kube-reserved="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156849 4782 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156855 4782 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156860 4782 flags.go:64] FLAG: --kubelet-cgroups="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156866 4782 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156872 4782 flags.go:64] FLAG: --lock-file="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156877 4782 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156883 4782 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156889 4782 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156902 4782 flags.go:64] FLAG: --log-json-split-stream="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156908 4782 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156913 4782 flags.go:64] FLAG: --log-text-split-stream="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156919 4782 flags.go:64] FLAG: --logging-format="text" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156925 4782 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 30 18:30:24 crc 
kubenswrapper[4782]: I0130 18:30:24.156930 4782 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156936 4782 flags.go:64] FLAG: --manifest-url="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156941 4782 flags.go:64] FLAG: --manifest-url-header="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156949 4782 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156956 4782 flags.go:64] FLAG: --max-open-files="1000000" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156963 4782 flags.go:64] FLAG: --max-pods="110" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156969 4782 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156975 4782 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156980 4782 flags.go:64] FLAG: --memory-manager-policy="None" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156986 4782 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156991 4782 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.156997 4782 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157003 4782 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157017 4782 flags.go:64] FLAG: --node-status-max-images="50" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157023 4782 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157028 4782 flags.go:64] FLAG: --oom-score-adj="-999" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157042 4782 flags.go:64] FLAG: --pod-cidr="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157047 4782 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157057 4782 flags.go:64] FLAG: --pod-manifest-path="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157063 4782 flags.go:64] FLAG: --pod-max-pids="-1" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157069 4782 flags.go:64] FLAG: --pods-per-core="0" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157074 4782 flags.go:64] FLAG: --port="10250" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157080 4782 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157086 4782 flags.go:64] FLAG: --provider-id="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157092 4782 flags.go:64] FLAG: --qos-reserved="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157098 4782 flags.go:64] FLAG: --read-only-port="10255" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157104 4782 flags.go:64] FLAG: --register-node="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157109 4782 flags.go:64] FLAG: --register-schedulable="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157115 4782 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 30 18:30:24 crc 
kubenswrapper[4782]: I0130 18:30:24.157124 4782 flags.go:64] FLAG: --registry-burst="10" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157130 4782 flags.go:64] FLAG: --registry-qps="5" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157136 4782 flags.go:64] FLAG: --reserved-cpus="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157141 4782 flags.go:64] FLAG: --reserved-memory="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157148 4782 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157154 4782 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157160 4782 flags.go:64] FLAG: --rotate-certificates="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157165 4782 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157171 4782 flags.go:64] FLAG: --runonce="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157176 4782 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157182 4782 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157188 4782 flags.go:64] FLAG: --seccomp-default="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157193 4782 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157199 4782 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157204 4782 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157211 4782 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157217 4782 flags.go:64] FLAG: --storage-driver-password="root" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157223 4782 flags.go:64] FLAG: --storage-driver-secure="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157249 4782 flags.go:64] FLAG: --storage-driver-table="stats" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157255 4782 flags.go:64] FLAG: --storage-driver-user="root" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157260 4782 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157266 4782 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157272 4782 flags.go:64] FLAG: --system-cgroups="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157291 4782 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157301 4782 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157307 4782 flags.go:64] FLAG: --tls-cert-file="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157312 4782 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157326 4782 flags.go:64] FLAG: --tls-min-version="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157332 4782 flags.go:64] FLAG: --tls-private-key-file="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157337 4782 flags.go:64] FLAG: --topology-manager-policy="none" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 
18:30:24.157342 4782 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157348 4782 flags.go:64] FLAG: --topology-manager-scope="container" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157354 4782 flags.go:64] FLAG: --v="2" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157362 4782 flags.go:64] FLAG: --version="false" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157370 4782 flags.go:64] FLAG: --vmodule="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157377 4782 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157382 4782 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157553 4782 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157559 4782 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157566 4782 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157573 4782 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157580 4782 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157586 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157592 4782 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157597 4782 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157604 4782 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157610 4782 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157622 4782 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157628 4782 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157634 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157640 4782 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157646 4782 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157651 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157657 4782 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157662 4782 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157666 4782 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 
18:30:24.157671 4782 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157676 4782 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157681 4782 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157696 4782 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157702 4782 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157706 4782 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157711 4782 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157716 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157721 4782 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157726 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157730 4782 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157735 4782 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157740 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157747 4782 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157752 4782 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157757 4782 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157762 4782 feature_gate.go:330] unrecognized feature gate: Example Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157768 4782 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
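Between the warning blocks, the flags.go:64 "FLAG:" entries earlier in this startup record the parsed value of every kubelet command-line flag, defaults included. Under the same assumption (a saved dump named kubelet.log), a minimal sketch recovers those entries as a name-to-value mapping:

    #!/usr/bin/env python3
    # Minimal sketch: rebuild the kubelet's flag dump (the flags.go:64 "FLAG:" entries)
    # as a name -> value mapping from a saved journal dump ("kubelet.log" is illustrative).
    import re

    PATTERN = re.compile(r'FLAG: --([\w-]+)="(.*?)"')

    def flag_dump(path="kubelet.log"):
        flags = {}
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                for name, value in PATTERN.findall(line):
                    flags[name] = value
        return flags

    if __name__ == "__main__":
        flags = flag_dump()
        for name in ("config", "kubeconfig", "container-runtime-endpoint", "node-ip"):
            print(f"--{name} = {flags.get(name)!r}")

Against the dump above this reports /etc/kubernetes/kubelet.conf, /var/lib/kubelet/kubeconfig, /var/run/crio/crio.sock and 192.168.126.11, respectively.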
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157774 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157780 4782 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157785 4782 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157790 4782 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157794 4782 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157802 4782 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157807 4782 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157811 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157816 4782 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157821 4782 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157826 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157833 4782 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157839 4782 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157845 4782 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157850 4782 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157856 4782 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157861 4782 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157867 4782 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157872 4782 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157877 4782 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157882 4782 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157895 4782 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157917 4782 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157923 4782 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157929 4782 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157934 4782 feature_gate.go:330] unrecognized 
feature gate: VSphereMultiNetworks Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157940 4782 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157945 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157950 4782 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157955 4782 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157959 4782 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157964 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157969 4782 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.157974 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.157990 4782 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.172006 4782 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.172057 4782 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172189 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172199 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172206 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172213 4782 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172220 4782 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172244 4782 feature_gate.go:330] unrecognized feature gate: Example Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172250 4782 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172255 4782 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172261 4782 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172266 4782 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172271 4782 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172277 
4782 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172282 4782 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172289 4782 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172295 4782 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172304 4782 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172317 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172324 4782 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172330 4782 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172337 4782 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172343 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172350 4782 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172357 4782 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172364 4782 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172371 4782 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172377 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172383 4782 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172400 4782 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172407 4782 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172413 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172419 4782 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172426 4782 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172435 4782 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172445 4782 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172455 4782 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172462 4782 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172468 4782 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172477 4782 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172486 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172494 4782 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172502 4782 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172509 4782 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172516 4782 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172522 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172529 4782 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172535 4782 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172541 4782 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172547 4782 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172554 4782 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172560 4782 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172566 4782 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172573 4782 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172579 4782 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172586 4782 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172592 4782 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172598 4782 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172604 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172610 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172618 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 
18:30:24.172625 4782 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172633 4782 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172641 4782 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172648 4782 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172657 4782 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172664 4782 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172670 4782 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172677 4782 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172684 4782 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172692 4782 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172699 4782 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172706 4782 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.172718 4782 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172893 4782 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172905 4782 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
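Once the unknown names are discarded, the kubelet logs the gate set it actually applies as a Go map dump, the feature_gate.go:386 "feature gates: {map[...]}" entries above; the same map is printed several times during this startup and is identical each time. A minimal sketch (same kubelet.log assumption) turns it into a dictionary:

    #!/usr/bin/env python3
    # Minimal sketch: parse the resolved gate list the kubelet logs as a Go map, e.g.
    # "feature gates: {map[CloudDualStackNodeIPs:true ... VolumeAttributesClass:false]}".
    import re

    MAP_RE = re.compile(r"feature gates: \{map\[(.*?)\]\}")
    PAIR_RE = re.compile(r"(\S+):(true|false)")

    def resolved_gates(path="kubelet.log"):
        gates = {}
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                for body in MAP_RE.findall(line):
                    for name, value in PAIR_RE.findall(body):
                        gates[name] = (value == "true")
        return gates

    if __name__ == "__main__":
        for name, enabled in sorted(resolved_gates().items()):
            print(f"{name}={enabled}")

For this log that yields fifteen gates, with CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1 and ValidatingAdmissionPolicy true and the rest false.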
Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172914 4782 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172920 4782 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172926 4782 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172931 4782 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172936 4782 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172942 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172947 4782 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172952 4782 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172957 4782 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172964 4782 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172970 4782 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172975 4782 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172980 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172986 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172991 4782 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.172996 4782 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173001 4782 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173006 4782 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173012 4782 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173017 4782 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173022 4782 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173027 4782 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173033 4782 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173038 4782 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173043 4782 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173048 4782 feature_gate.go:330] unrecognized feature 
gate: SigstoreImageVerification Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173053 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173058 4782 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173064 4782 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173070 4782 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173075 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173080 4782 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173085 4782 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173090 4782 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173095 4782 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173102 4782 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173108 4782 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173114 4782 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173120 4782 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173125 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173131 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173137 4782 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173142 4782 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173148 4782 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173153 4782 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173158 4782 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173164 4782 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173169 4782 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173174 4782 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173179 4782 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173184 4782 feature_gate.go:330] unrecognized feature gate: 
InsightsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173190 4782 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173195 4782 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173200 4782 feature_gate.go:330] unrecognized feature gate: Example Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173206 4782 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173211 4782 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173216 4782 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173221 4782 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173249 4782 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173255 4782 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173260 4782 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173265 4782 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173270 4782 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173275 4782 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173280 4782 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173287 4782 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173293 4782 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173299 4782 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.173304 4782 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.173313 4782 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.174666 4782 server.go:940] "Client rotation is on, will bootstrap in background" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.179930 4782 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.180026 4782 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
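The entries just above show client certificate rotation taking over: the existing kubeconfig is still valid, so no bootstrap is needed, and the kubelet loads its client pair from /var/lib/kubelet/pki/kubelet-client-current.pem. As a minimal sketch, assuming it is run on the node itself with openssl installed and read access to that path (typically root), the certificate's notAfter date can be checked directly and compared with the expiration the certificate manager logs next:

    #!/usr/bin/env python3
    # Minimal sketch: print the notAfter date of the kubelet client certificate whose
    # path is logged above. Assumes the script runs on the node, openssl is installed,
    # and the caller can read /var/lib/kubelet/pki (typically root).
    import subprocess

    CERT = "/var/lib/kubelet/pki/kubelet-client-current.pem"

    def not_after(path=CERT):
        out = subprocess.run(
            ["openssl", "x509", "-noout", "-enddate", "-in", path],
            check=True, capture_output=True, text=True,
        )
        # openssl prints a single line of the form "notAfter=Feb 24 05:52:08 2026 GMT"
        return out.stdout.strip().split("=", 1)[1]

    if __name__ == "__main__":
        print(not_after())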
Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.185470 4782 server.go:997] "Starting client certificate rotation" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.185514 4782 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.185755 4782 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-06 15:27:27.672241836 +0000 UTC Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.185901 4782 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.212602 4782 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.216552 4782 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.216657 4782 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.241036 4782 log.go:25] "Validated CRI v1 runtime API" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.281990 4782 log.go:25] "Validated CRI v1 image API" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.284716 4782 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.289625 4782 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-30-18-25-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.289679 4782 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.313187 4782 manager.go:217] Machine: {Timestamp:2026-01-30 18:30:24.309135632 +0000 UTC m=+0.577513727 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:543d713b-1dda-4bd4-bcbd-c7e5af310fc0 BootID:cfb15d18-0b1e-4acd-9f2a-ea49a1afc5bb Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 
Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f1:c7:75 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f1:c7:75 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:8a:bc:2c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4f:99:18 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:82:fd:75 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ce:d4:a9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1e:d4:fe:46:c4:88 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5a:69:1d:6d:7f:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 
Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.313708 4782 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.314003 4782 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.314657 4782 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.314972 4782 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.315028 4782 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.315437 4782 topology_manager.go:138] "Creating topology manager with none policy" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.315493 4782 
container_manager_linux.go:303] "Creating device plugin manager" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.316042 4782 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.316829 4782 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.317217 4782 state_mem.go:36] "Initialized new in-memory state store" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.317410 4782 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.321037 4782 kubelet.go:418] "Attempting to sync node with API server" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.321083 4782 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.321189 4782 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.321266 4782 kubelet.go:324] "Adding apiserver pod source" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.321304 4782 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.326354 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.326440 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.326514 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.326538 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.328762 4782 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.330031 4782 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
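
The container-manager entry above prints the full NodeConfig as JSON (systemd cgroup driver, cgroup v2, SystemReserved of 200m CPU / 350Mi memory, PodPidsLimit 4096, and the hard eviction thresholds). A sketch, under the same hypothetical exported-log assumption, that pulls that JSON fragment out of the journal text and lists the eviction signals it encodes:

# Sketch: extract the NodeConfig JSON logged by container_manager_linux.go
# and print its hard eviction thresholds. "kubelet.log" is a hypothetical export.
import json
import re

NODECONFIG = re.compile(r'nodeConfig=(\{.*"CgroupVersion":\d+\})')

def eviction_thresholds(path="kubelet.log"):
    with open(path) as fh:
        for line in fh:
            m = NODECONFIG.search(line)
            if m:
                cfg = json.loads(m.group(1))
                # Each threshold carries either an absolute Quantity or a Percentage.
                return [
                    (t["Signal"],
                     t["Value"].get("Quantity") or f'{t["Value"]["Percentage"]:.0%}')
                    for t in cfg["HardEvictionThresholds"]
                ]
    return []

if __name__ == "__main__":
    for signal, threshold in eviction_thresholds():
        print(f"{signal:<22} < {threshold}")
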
Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.332266 4782 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335670 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335714 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335728 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335744 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335767 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335783 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335796 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335820 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335835 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335853 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335895 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.335910 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.336843 4782 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.337762 4782 server.go:1280] "Started kubelet" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.339168 4782 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.339416 4782 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.339165 4782 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 18:30:24 crc systemd[1]: Started Kubernetes Kubelet. 
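
Throughout this startup window every call to https://api-int.crc.testing:6443 (the CSR post, the node/service/CSIDriver informers, the CSINode check, lease creation, event posting) fails with "connection refused", consistent with the API-server static pod not being up yet while the kubelet itself has started and is listening on 10250. A sketch, again assuming a hypothetical "kubelet.log" export, that tallies which callers are hitting the refused endpoint:

# Sketch: count "connection refused" dials to the internal API endpoint by
# the source file:line that logged them. "kubelet.log" is a hypothetical export.
import re
from collections import Counter

REFUSED = re.compile(
    r'(\S+\.go:\d+)\].*?dial tcp [\d.]+:6443: connect: connection refused'
)

def refused_by_caller(path="kubelet.log"):
    hits = Counter()
    with open(path) as fh:
        for line in fh:
            for caller in REFUSED.findall(line):
                hits[caller] += 1
    return hits

if __name__ == "__main__":
    for caller, count in refused_by_caller().most_common():
        print(f"{count:>4}  {caller}")
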
Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.340443 4782 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.341596 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.341635 4782 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.341689 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 11:51:58.254326363 +0000 UTC Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.341890 4782 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.341927 4782 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.341950 4782 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.342126 4782 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.342697 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.342794 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.343945 4782 server.go:460] "Adding debug handlers to kubelet server" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.344152 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.345017 4782 factory.go:55] Registering systemd factory Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.345218 4782 factory.go:221] Registration of the systemd container factory successfully Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.346097 4782 factory.go:153] Registering CRI-O factory Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.346204 4782 factory.go:221] Registration of the crio container factory successfully Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.346524 4782 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.346631 4782 factory.go:103] Registering Raw factory Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.346669 4782 manager.go:1196] Started watching for new ooms in manager Jan 30 18:30:24 crc 
kubenswrapper[4782]: I0130 18:30:24.353567 4782 manager.go:319] Starting recovery of all containers Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.347786 4782 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f95be93e87305 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 18:30:24.337703685 +0000 UTC m=+0.606081750,LastTimestamp:2026-01-30 18:30:24.337703685 +0000 UTC m=+0.606081750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363509 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363571 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363585 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363598 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363612 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363628 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363642 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363656 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 30 18:30:24 crc 
kubenswrapper[4782]: I0130 18:30:24.363672 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363684 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363698 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363710 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363725 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363741 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363753 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363767 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363779 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363791 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363825 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" 
seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363840 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363856 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363871 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363885 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363898 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363914 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363949 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363967 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.363985 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364014 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364032 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 
18:30:24.364050 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364066 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364113 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364132 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364152 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364167 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364182 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364212 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364248 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364282 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364297 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364312 
4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364325 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364338 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364353 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364366 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364380 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364393 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364406 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364427 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364450 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364467 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364489 4782 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364510 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364526 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364544 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364563 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364579 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364592 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364606 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364621 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364636 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364649 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364663 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364677 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364691 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364704 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364719 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364733 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364747 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364762 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364775 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364788 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364803 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364817 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364829 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364845 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364859 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364873 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364888 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.364944 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369150 4782 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369304 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369340 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369403 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369431 4782 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369452 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369473 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369494 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369516 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369541 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369562 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369599 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369618 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369641 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369664 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369683 4782 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369706 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369726 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369746 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369767 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369787 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369808 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369829 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369853 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369889 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369921 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369949 4782 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.369976 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370003 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370027 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370051 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370075 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370099 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370122 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370149 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370171 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370193 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370212 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370257 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370278 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370326 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370346 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370367 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370392 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370412 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370434 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370457 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370477 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370498 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370519 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370542 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370563 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370584 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370604 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370634 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370654 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370674 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370694 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370715 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370739 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370806 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370827 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370850 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370870 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370889 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370908 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370928 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370949 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370969 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.370989 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371009 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371033 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371056 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371077 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371101 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371136 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371158 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371179 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371201 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371224 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371268 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371289 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371310 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371328 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371348 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371368 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371392 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371415 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371435 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371454 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371500 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371526 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371547 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371567 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371588 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371618 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371640 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371660 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371680 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371701 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371722 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371741 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371759 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371782 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371802 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371827 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371848 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371871 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371891 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371912 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371933 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371955 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371976 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.371996 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372016 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372037 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372059 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372079 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372101 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372123 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372145 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372165 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372187 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372210 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372252 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372272 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372293 4782 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372313 4782 reconstruct.go:97] "Volume reconstruction finished" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.372327 4782 reconciler.go:26] "Reconciler: start to sync state" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.393874 4782 manager.go:324] Recovery completed Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.403543 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.406265 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.406301 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.406315 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.406360 4782 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.408042 4782 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.408061 4782 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.408082 4782 state_mem.go:36] "Initialized new in-memory state store" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.409451 4782 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.409483 4782 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.409509 4782 kubelet.go:2335] "Starting kubelet main sync loop" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.409631 4782 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.410426 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.410549 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.431536 4782 policy_none.go:49] "None policy: Start" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.432446 4782 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.432496 4782 state_mem.go:35] "Initializing new in-memory state store" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.442359 4782 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.501871 4782 manager.go:334] "Starting Device Plugin manager" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.502433 4782 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.502452 4782 server.go:79] "Starting device plugin registration server" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.503155 4782 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.503198 4782 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.503657 4782 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.503807 4782 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.503828 4782 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.509820 4782 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.509956 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.512379 4782 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.512436 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.512450 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.512706 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.513056 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.513131 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.517155 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.517180 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.517207 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.516955 4782 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.517322 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.517335 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.517357 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.517691 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.518331 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.518406 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.520324 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.520351 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.520364 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.520760 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.520820 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.520964 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.523602 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.523642 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.523659 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524320 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524362 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524380 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524319 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524464 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524477 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524641 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524905 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.524957 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526059 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526083 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526093 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526292 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526309 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526353 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526365 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.526320 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.527203 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.527252 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.527262 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.545018 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.574859 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.574938 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.574965 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.574985 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575051 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575091 
4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575109 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575148 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575194 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575215 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575253 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575291 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575325 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575344 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.575362 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.603764 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.605222 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.605332 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.605365 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.605419 4782 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.606440 4782 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677127 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677277 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677337 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677387 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677478 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677506 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677526 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677575 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677623 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677635 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677668 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677717 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677657 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677736 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677758 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677519 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677616 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677814 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677840 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677959 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677849 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677886 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.678004 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.677874 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.678032 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.678069 4782 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.678099 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.678126 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.678019 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.678490 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.806990 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.808843 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.808908 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.808928 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.808970 4782 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.809653 4782 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.855913 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.872358 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.893384 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.901004 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: I0130 18:30:24.905363 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.914719 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-0b9f0b45465a11640160869f6c026da9ad77178520fe31265ba5fbafab1dfed1 WatchSource:0}: Error finding container 0b9f0b45465a11640160869f6c026da9ad77178520fe31265ba5fbafab1dfed1: Status 404 returned error can't find the container with id 0b9f0b45465a11640160869f6c026da9ad77178520fe31265ba5fbafab1dfed1 Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.916745 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-989c5758a509508263bdf305a8d8026d031005384d40bcea62709e570d521d46 WatchSource:0}: Error finding container 989c5758a509508263bdf305a8d8026d031005384d40bcea62709e570d521d46: Status 404 returned error can't find the container with id 989c5758a509508263bdf305a8d8026d031005384d40bcea62709e570d521d46 Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.929089 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-4505894d0f24c861c7926647850fc1d5dd2b8db8c41b12d93512d8a09cb8cbb0 WatchSource:0}: Error finding container 4505894d0f24c861c7926647850fc1d5dd2b8db8c41b12d93512d8a09cb8cbb0: Status 404 returned error can't find the container with id 4505894d0f24c861c7926647850fc1d5dd2b8db8c41b12d93512d8a09cb8cbb0 Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.939996 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1563a654d2ddad8c2aa87add570a6295ca29e125462d5b32d4f63102957660db WatchSource:0}: Error finding container 1563a654d2ddad8c2aa87add570a6295ca29e125462d5b32d4f63102957660db: Status 404 returned error can't find the container with id 1563a654d2ddad8c2aa87add570a6295ca29e125462d5b32d4f63102957660db Jan 30 18:30:24 crc kubenswrapper[4782]: W0130 18:30:24.940597 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-613dfbbfb9f77ca7591b54040dc5a332f3a42496a0e9ffe09c1c7c8ab6312d3f WatchSource:0}: Error finding container 613dfbbfb9f77ca7591b54040dc5a332f3a42496a0e9ffe09c1c7c8ab6312d3f: Status 404 returned error can't find the container with id 613dfbbfb9f77ca7591b54040dc5a332f3a42496a0e9ffe09c1c7c8ab6312d3f Jan 30 18:30:24 crc kubenswrapper[4782]: E0130 18:30:24.945987 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.210048 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.211755 4782 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.211818 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.211834 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.211874 4782 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 18:30:25 crc kubenswrapper[4782]: E0130 18:30:25.212607 4782 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.340738 4782 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.342853 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 19:34:28.027485661 +0000 UTC Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.421355 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0b9f0b45465a11640160869f6c026da9ad77178520fe31265ba5fbafab1dfed1"} Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.422523 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"613dfbbfb9f77ca7591b54040dc5a332f3a42496a0e9ffe09c1c7c8ab6312d3f"} Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.423941 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1563a654d2ddad8c2aa87add570a6295ca29e125462d5b32d4f63102957660db"} Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.425647 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4505894d0f24c861c7926647850fc1d5dd2b8db8c41b12d93512d8a09cb8cbb0"} Jan 30 18:30:25 crc kubenswrapper[4782]: I0130 18:30:25.427166 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"989c5758a509508263bdf305a8d8026d031005384d40bcea62709e570d521d46"} Jan 30 18:30:25 crc kubenswrapper[4782]: E0130 18:30:25.748013 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s" Jan 30 18:30:25 crc kubenswrapper[4782]: W0130 18:30:25.773749 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 
38.102.83.212:6443: connect: connection refused Jan 30 18:30:25 crc kubenswrapper[4782]: E0130 18:30:25.773914 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:25 crc kubenswrapper[4782]: W0130 18:30:25.787800 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:25 crc kubenswrapper[4782]: E0130 18:30:25.787882 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:25 crc kubenswrapper[4782]: W0130 18:30:25.820207 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:25 crc kubenswrapper[4782]: E0130 18:30:25.820403 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:25 crc kubenswrapper[4782]: W0130 18:30:25.938746 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:25 crc kubenswrapper[4782]: E0130 18:30:25.938880 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.013647 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.015613 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.015668 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.015680 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.015718 4782 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 18:30:26 crc kubenswrapper[4782]: E0130 18:30:26.016366 4782 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.308466 4782 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 18:30:26 crc kubenswrapper[4782]: E0130 18:30:26.309950 4782 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.341929 4782 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.343275 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 12:13:35.411089446 +0000 UTC Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.433092 4782 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8" exitCode=0 Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.433188 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8"} Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.433360 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.434710 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.434753 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.434766 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.435169 4782 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="348a74197ce5ccab0d174c2fa16ac7422a0839e958a9ff9ab5379306aff0f63d" exitCode=0 Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.435381 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.435361 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"348a74197ce5ccab0d174c2fa16ac7422a0839e958a9ff9ab5379306aff0f63d"} Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.436633 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 
18:30:26.436683 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.436699 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.437625 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.437886 4782 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8b6bfb7643574cb3ce70b6d4d5bda09c012e92c012046a54b05b9a1ee1982244" exitCode=0 Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.437986 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.438028 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8b6bfb7643574cb3ce70b6d4d5bda09c012e92c012046a54b05b9a1ee1982244"} Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.438870 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.438900 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.438915 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.439039 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.439065 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.439075 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.441000 4782 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="94bf45ed7925ffa89143457ea0f4a7bad21498f9cb4a24523c97d65c3a7000fd" exitCode=0 Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.441065 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"94bf45ed7925ffa89143457ea0f4a7bad21498f9cb4a24523c97d65c3a7000fd"} Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.441136 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.442169 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.442197 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.442209 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 
18:30:26.445028 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"04cce4444fd40cff3896cb7085e5f5c874770d282bf660e3eb7ba1141e0f0a01"} Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.445074 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"77a0be3e68f11584e3bc7a8863e78036e721e9df9c64e0bb9927e5c718aa562c"} Jan 30 18:30:26 crc kubenswrapper[4782]: I0130 18:30:26.445091 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ee770bfa5252c9d774f9f21d0f942eb6eeac2f5c8708cc9daa5afed482debec0"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.340850 4782 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.343989 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 10:32:17.769629668 +0000 UTC Jan 30 18:30:27 crc kubenswrapper[4782]: E0130 18:30:27.349921 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.451829 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.451878 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.451892 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.456980 4782 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e535fbce3e1249d17792d1e94cc58e996bd63f39557ae39cecd1cd6f455b6b4b" exitCode=0 Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.457134 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.457363 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e535fbce3e1249d17792d1e94cc58e996bd63f39557ae39cecd1cd6f455b6b4b"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.458329 
4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.458357 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.458367 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.463196 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5174793e6ecb916cc58f5f14d71c6c3f7e7ff3d73748bbeeecb58070bc6d882b"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.463289 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.464593 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.464649 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.464663 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.468590 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"accd5fb7b03f7c1e164d37b6a570784f44f778a406ade0970ddfa1d1d2b439d9"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.468633 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e72b786ece4c8568f2446c86ce71efabb048f754582b5a35827814611ef0e7e9"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.468648 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"78b4c4497a17f59b7b2a847f32f9877702a6caa0119eab19f5410f0026f5d643"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.468614 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.470176 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.470246 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.470260 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.475918 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"728cc6c30e2beeea5b30c93859b8a6f6b78958ac2835c3510d4e52d2c2e834b9"} Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.476026 4782 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.476938 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.476965 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.476976 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:27 crc kubenswrapper[4782]: W0130 18:30:27.565013 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:27 crc kubenswrapper[4782]: E0130 18:30:27.565152 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.616548 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.618182 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.618264 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.618281 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.618321 4782 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 18:30:27 crc kubenswrapper[4782]: E0130 18:30:27.619035 4782 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Jan 30 18:30:27 crc kubenswrapper[4782]: I0130 18:30:27.852476 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.340946 4782 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.345144 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 01:35:16.291872201 +0000 UTC Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.484202 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805"} Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.484322 4782 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98"} Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.484859 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.486082 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.486128 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.486141 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.487393 4782 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="35231ac95f38bf985de2ffc81114c60d0abd5e1bccc22f1df10b61e9c980575b" exitCode=0 Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.487590 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.487620 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.487638 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"35231ac95f38bf985de2ffc81114c60d0abd5e1bccc22f1df10b61e9c980575b"} Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.487628 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.487750 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.488543 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.488796 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.488820 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.488835 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489315 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489367 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489387 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489407 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 
18:30:28.489442 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489457 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489642 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489659 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.489672 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:28 crc kubenswrapper[4782]: I0130 18:30:28.822919 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.346306 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 14:39:27.766305722 +0000 UTC Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.498071 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.498117 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.498173 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.498310 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.498937 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"380c5a8333597c38ca345290ee8de3ddc65c8ed7582efc7ab7da7af9ff7e8a89"} Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.499030 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a8eaf96b2d2045eb92e7820a989c00b5c7883d0f44830435fbc21c8d75e40e1e"} Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.499056 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9035dcf5d9ec4ed1dcd144417dd235ee7bda4d30da4ffebea2b962814cac1925"} Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.499070 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b2cebc0f18742d47839f139928ba942d1cb409d32f44c7e99056355aaa8366a5"} Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.499787 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.499834 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.499851 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.500831 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.500924 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.500945 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.500909 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.501006 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:29 crc kubenswrapper[4782]: I0130 18:30:29.501029 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.347283 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 14:42:33.507666362 +0000 UTC Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.512557 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"97391d1c35d2240f02a54ab02c5d50fe3dc99daa2d47ba5a4fc42f612519ed20"} Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.512690 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.512707 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.512882 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.514096 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.514175 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.514195 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.514208 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.514301 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.514333 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.706441 4782 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.819951 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.821889 4782 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.821945 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.821965 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.822006 4782 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 18:30:30 crc kubenswrapper[4782]: I0130 18:30:30.893740 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.348031 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 22:29:47.523856254 +0000 UTC Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.515203 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.515457 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.515277 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.516447 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.516481 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.516495 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.516792 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.516828 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.516846 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:31 crc kubenswrapper[4782]: I0130 18:30:31.959587 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.162508 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.349027 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:21:44.872733065 +0000 UTC Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.517612 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.517860 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.518678 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.518726 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.518745 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.519181 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.519213 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:32 crc kubenswrapper[4782]: I0130 18:30:32.519248 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:33 crc kubenswrapper[4782]: I0130 18:30:33.349336 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 14:39:11.116388679 +0000 UTC Jan 30 18:30:34 crc kubenswrapper[4782]: I0130 18:30:34.349903 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 02:20:07.878010933 +0000 UTC Jan 30 18:30:34 crc kubenswrapper[4782]: E0130 18:30:34.518120 4782 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 18:30:34 crc kubenswrapper[4782]: I0130 18:30:34.682691 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 30 18:30:34 crc kubenswrapper[4782]: I0130 18:30:34.682939 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:34 crc kubenswrapper[4782]: I0130 18:30:34.684382 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:34 crc kubenswrapper[4782]: I0130 18:30:34.684419 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:34 crc kubenswrapper[4782]: I0130 18:30:34.684435 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:35 crc kubenswrapper[4782]: I0130 18:30:35.350530 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 09:45:09.866420617 +0000 UTC Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.292804 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.293091 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.295168 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.295219 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.295245 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 
18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.351519 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 05:09:03.68385998 +0000 UTC Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.456135 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.465539 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.527962 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.529013 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.529069 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.529089 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:36 crc kubenswrapper[4782]: I0130 18:30:36.533160 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:37 crc kubenswrapper[4782]: I0130 18:30:37.352441 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 15:29:33.016088212 +0000 UTC Jan 30 18:30:37 crc kubenswrapper[4782]: I0130 18:30:37.399323 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:37 crc kubenswrapper[4782]: I0130 18:30:37.530702 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:37 crc kubenswrapper[4782]: I0130 18:30:37.532248 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:37 crc kubenswrapper[4782]: I0130 18:30:37.532362 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:37 crc kubenswrapper[4782]: I0130 18:30:37.532390 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:38 crc kubenswrapper[4782]: I0130 18:30:38.352901 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 21:37:24.88196401 +0000 UTC Jan 30 18:30:38 crc kubenswrapper[4782]: I0130 18:30:38.533864 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:38 crc kubenswrapper[4782]: I0130 18:30:38.535578 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:38 crc kubenswrapper[4782]: I0130 18:30:38.535672 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:38 crc kubenswrapper[4782]: I0130 18:30:38.535700 4782 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 30 18:30:38 crc kubenswrapper[4782]: W0130 18:30:38.571961 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 30 18:30:38 crc kubenswrapper[4782]: I0130 18:30:38.572113 4782 trace.go:236] Trace[295471144]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 18:30:28.570) (total time: 10001ms): Jan 30 18:30:38 crc kubenswrapper[4782]: Trace[295471144]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:30:38.571) Jan 30 18:30:38 crc kubenswrapper[4782]: Trace[295471144]: [10.001772373s] [10.001772373s] END Jan 30 18:30:38 crc kubenswrapper[4782]: E0130 18:30:38.572171 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 30 18:30:38 crc kubenswrapper[4782]: W0130 18:30:38.587101 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 30 18:30:38 crc kubenswrapper[4782]: I0130 18:30:38.587525 4782 trace.go:236] Trace[133669068]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 18:30:28.585) (total time: 10002ms): Jan 30 18:30:38 crc kubenswrapper[4782]: Trace[133669068]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:30:38.587) Jan 30 18:30:38 crc kubenswrapper[4782]: Trace[133669068]: [10.002074745s] [10.002074745s] END Jan 30 18:30:38 crc kubenswrapper[4782]: E0130 18:30:38.587630 4782 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 30 18:30:39 crc kubenswrapper[4782]: W0130 18:30:39.111399 4782 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.111526 4782 trace.go:236] Trace[1961875596]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 18:30:29.109) (total time: 10002ms): Jan 30 18:30:39 crc kubenswrapper[4782]: Trace[1961875596]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:30:39.111) Jan 30 18:30:39 crc kubenswrapper[4782]: Trace[1961875596]: [10.002074964s] [10.002074964s] END Jan 30 18:30:39 crc kubenswrapper[4782]: E0130 18:30:39.111561 4782 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.341723 4782 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.353090 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 02:30:15.360584329 +0000 UTC Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.537438 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.538902 4782 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805" exitCode=255 Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.538948 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805"} Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.539103 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.539818 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.539865 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.539886 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.540735 4782 scope.go:117] "RemoveContainer" containerID="349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.936116 4782 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.936230 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.942703 4782 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 30 18:30:39 crc kubenswrapper[4782]: I0130 18:30:39.942778 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.353831 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 19:07:14.483769567 +0000 UTC Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.399316 4782 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.399495 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.544533 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.546656 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab"} Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.547048 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.548408 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.548460 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.548480 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.903270 4782 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]log ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]etcd ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 30 18:30:40 crc kubenswrapper[4782]: 
[+]poststarthook/start-apiserver-admission-initializer ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/generic-apiserver-start-informers ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/priority-and-fairness-filter ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-apiextensions-informers ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-apiextensions-controllers ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/crd-informer-synced ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-system-namespaces-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 30 18:30:40 crc kubenswrapper[4782]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 30 18:30:40 crc kubenswrapper[4782]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/bootstrap-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/start-kube-aggregator-informers ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/apiservice-registration-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/apiservice-discovery-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]autoregister-completion ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/apiservice-openapi-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 30 18:30:40 crc kubenswrapper[4782]: livez check failed Jan 30 18:30:40 crc kubenswrapper[4782]: I0130 18:30:40.903343 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:30:41 crc kubenswrapper[4782]: I0130 18:30:41.355136 4782 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 21:28:45.944403512 +0000 UTC Jan 30 18:30:41 crc kubenswrapper[4782]: I0130 18:30:41.994427 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 30 18:30:41 crc kubenswrapper[4782]: I0130 18:30:41.994704 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:41 crc kubenswrapper[4782]: I0130 18:30:41.996442 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:41 crc kubenswrapper[4782]: I0130 18:30:41.996489 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:41 crc kubenswrapper[4782]: I0130 18:30:41.996503 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.011099 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.162628 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.162934 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.164747 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.164793 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.164816 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.272011 4782 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.356275 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 08:33:18.712989823 +0000 UTC Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.552752 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.554281 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.554351 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:42 crc kubenswrapper[4782]: I0130 18:30:42.554370 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:43 crc kubenswrapper[4782]: I0130 18:30:43.022126 4782 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 18:30:43 crc kubenswrapper[4782]: I0130 18:30:43.357642 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 05:33:59.807344454 +0000 UTC Jan 30 18:30:43 crc 
kubenswrapper[4782]: I0130 18:30:43.397487 4782 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.333565 4782 apiserver.go:52] "Watching apiserver" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.338545 4782 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.339192 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.339959 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.340534 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:44 crc kubenswrapper[4782]: E0130 18:30:44.340648 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.340963 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.341030 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.341036 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:44 crc kubenswrapper[4782]: E0130 18:30:44.341209 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.341655 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:44 crc kubenswrapper[4782]: E0130 18:30:44.341893 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
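The "Error syncing pod, skipping" entries above all trace back to NetworkReady=false: the container runtime reports that /etc/kubernetes/cni/net.d/ contains no CNI configuration yet, so sandboxes for these pods cannot be created until the network provider writes one. The sketch below is a rough, stand-alone approximation of that readiness condition, not CRI-O's actual check; the directory path is taken from the log message, and the extension list is the set commonly accepted for CNI config files.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // path quoted in the log line above
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	var confs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions commonly accepted for CNI configs
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file found; network plugin not ready")
		return
	}
	fmt.Println("CNI configs present:", confs)
}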
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.343741 4782 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.344628 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.344919 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.346340 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.347497 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.347732 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.347780 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.347813 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.347846 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.348152 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.358639 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 23:32:58.014214 +0000 UTC Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.389458 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.408441 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.430978 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.448170 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.465022 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.482337 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.498454 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.516094 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.530867 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.549130 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.564071 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.577495 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.592415 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:44 crc kubenswrapper[4782]: E0130 18:30:44.937423 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.938728 4782 trace.go:236] Trace[1368407360]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 18:30:33.601) (total time: 11336ms): Jan 30 18:30:44 crc kubenswrapper[4782]: Trace[1368407360]: ---"Objects listed" error: 11336ms (18:30:44.938) Jan 30 18:30:44 crc kubenswrapper[4782]: Trace[1368407360]: [11.336928602s] [11.336928602s] END Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.938764 4782 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 30 18:30:44 crc kubenswrapper[4782]: E0130 18:30:44.940151 4782 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.941775 4782 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 30 18:30:44 crc kubenswrapper[4782]: I0130 18:30:44.946020 4782 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042134 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042179 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042202 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042217 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" 
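Every "Failed to update status for pod" entry above ends the same way: the API server cannot call the pod.network-node-identity.openshift.io mutating webhook because nothing is listening on 127.0.0.1:9743 while the network-node-identity pod itself is still being recreated, so the kubelet's status patches bounce with "connection refused" (and, in the same window, lease renewal and node registration are also still failing). A plain TCP probe, sketched below, reproduces that failure mode; the address is taken verbatim from the log and the snippet is only an illustration, not part of the kubelet.

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Until the webhook container is running, this dial fails with
	// "connect: connection refused", the same underlying error reported in
	// the status_manager entries above.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
	if err != nil {
		fmt.Println("webhook endpoint not reachable:", err)
		return
	}
	conn.Close()
	fmt.Println("webhook endpoint is accepting connections")
}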
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042259 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042276 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042291 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042306 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042323 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042361 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042376 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042391 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042424 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042441 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042459 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042480 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042526 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042554 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042577 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042603 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042621 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042610 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042639 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042739 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042774 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042799 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042825 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042853 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042879 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042923 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.042984 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043015 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043042 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043071 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043101 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043156 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043180 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043202 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043232 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043272 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043296 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043301 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043323 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043360 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043390 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043421 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043446 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043476 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043500 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043524 4782 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043549 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043554 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043574 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043600 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043605 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043627 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043651 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043781 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043812 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043871 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043891 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043916 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043943 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043971 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043989 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044017 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044042 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044064 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044086 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044106 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044126 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044145 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044163 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044183 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044201 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044231 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044276 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044293 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" 
(UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044310 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044334 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044361 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044381 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044399 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044418 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044435 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044451 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044471 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044491 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044510 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044527 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044548 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044576 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044594 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044612 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044629 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044650 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044668 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044685 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044705 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044725 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044743 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044765 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044838 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044889 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044908 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044924 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044940 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044977 4782 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044995 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045015 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045037 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045062 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045080 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045097 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045115 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045134 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045154 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 
18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045173 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045190 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045208 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045228 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045257 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045278 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045301 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045318 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045335 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045353 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" 
(UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045372 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045394 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045419 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045442 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045463 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045482 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045500 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045517 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045538 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045554 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045573 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045593 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045614 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045632 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045651 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045699 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045738 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045756 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045773 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045789 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045807 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045831 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045849 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045867 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045883 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045899 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045919 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045938 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045958 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045976 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045994 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046012 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046034 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046053 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046073 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046091 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046111 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046129 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046145 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046162 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046180 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046196 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046213 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046656 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046683 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046700 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046716 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043791 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043927 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.043937 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047004 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044024 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044060 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044104 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044199 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044320 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044323 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044457 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044478 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044477 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044692 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044741 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.044973 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045004 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045016 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045043 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045114 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045268 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045375 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045437 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.045794 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046328 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046428 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046903 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.046953 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047602 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047636 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047667 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047666 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047686 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047710 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047732 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047750 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047773 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047778 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047817 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047787 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047793 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.047903 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:30:45.547868327 +0000 UTC m=+21.816246562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.047996 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048052 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048103 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048145 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048203 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048557 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048608 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048650 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048695 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048738 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048781 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048823 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048844 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.048863 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.049223 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.049285 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.049308 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.049333 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.049356 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052529 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052618 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052638 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052738 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052782 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052812 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052847 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052877 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052908 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052943 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052965 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052948 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.052994 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053039 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053063 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053093 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053106 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053259 4782 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053276 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053288 4782 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053301 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053312 4782 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053324 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053336 4782 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053347 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053386 4782 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053398 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053409 4782 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053429 4782 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053440 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" 
DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053452 4782 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053464 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053476 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053487 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053497 4782 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053509 4782 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053532 4782 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053544 4782 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053556 4782 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053566 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053582 4782 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053592 4782 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053603 4782 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on 
node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053613 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053623 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053635 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053645 4782 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053655 4782 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053664 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053675 4782 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053681 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053686 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053716 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053726 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053737 4782 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053749 4782 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053759 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053770 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053336 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053530 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.053833 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.054058 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.054416 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.054616 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.056017 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.056256 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.056497 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.057148 4782 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.058576 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.059099 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.065812 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.066736 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.075172 4782 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.075369 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:45.575344088 +0000 UTC m=+21.843722113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.075477 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.075541 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.075776 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.075848 4782 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.076064 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.076253 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:45.576206509 +0000 UTC m=+21.844584754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.076290 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.076450 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.076019 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.076430 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.076820 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.076987 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.077013 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.077572 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.077914 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.078038 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.078498 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.078657 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.078927 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.079016 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.079188 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.079476 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.079619 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.079957 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.080200 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.080447 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.082380 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.082601 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.083448 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.083826 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.093467 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.093734 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.093919 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.094118 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.094751 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.096740 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.097272 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.100669 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.101213 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.101782 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.102214 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.102804 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.103313 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.103779 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.103935 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.105336 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.105515 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.105600 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.105661 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). 
InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.105920 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.106752 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.106763 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.107394 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.107449 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.107764 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.107764 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.107807 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.107830 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.108016 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.108372 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.108454 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.108671 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.109448 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.109734 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.110149 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.110635 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.111922 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.112448 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.112823 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.113179 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.113531 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.113798 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.116556 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.118587 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.118658 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.118930 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.119233 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.119470 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.119529 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.120053 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.124201 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.124353 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.124632 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.124728 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.124840 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.124984 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.125517 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.125596 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.125901 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.125929 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.125975 4782 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.126093 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:45.626070728 +0000 UTC m=+21.894448763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.126661 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.129029 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.129696 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.130029 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.130220 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.130424 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.130500 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.130612 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.130976 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.131400 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.131506 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.132100 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.132162 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.132870 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.133165 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.133195 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.133208 4782 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.133294 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:45.633273652 +0000 UTC m=+21.901651677 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.133354 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.133725 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.131158 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.135145 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.135480 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.136111 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.136142 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.138509 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.138644 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.138779 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.138843 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.138961 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.140095 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.140153 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.140386 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.140521 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.142072 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.142110 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.142545 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.142619 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.142927 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.143142 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.144963 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.145483 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.146013 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.146505 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.153094 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.155361 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.155598 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.155634 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.156613 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157362 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157443 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157559 4782 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157577 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157594 4782 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157606 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157618 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157630 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157646 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157659 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157669 4782 reconciler_common.go:293] "Volume detached for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157679 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157693 4782 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157705 4782 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157715 4782 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157727 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157742 4782 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157753 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157764 4782 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157778 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157791 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157803 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157815 4782 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157829 4782 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157841 4782 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157852 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157864 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157879 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157892 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157903 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157916 4782 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157932 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157943 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157957 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157973 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157985 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.157999 4782 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158012 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158026 4782 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158038 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158050 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158061 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158077 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158087 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158099 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158113 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158126 4782 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158137 4782 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158149 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 
18:30:45.158165 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158176 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158187 4782 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158199 4782 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158214 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158229 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158258 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158270 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158284 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158349 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158410 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158294 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc 
kubenswrapper[4782]: I0130 18:30:45.158437 4782 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158453 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158466 4782 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158479 4782 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158509 4782 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158521 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158532 4782 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158548 4782 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158562 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158573 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158583 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158595 4782 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158611 4782 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158623 4782 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158633 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158649 4782 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158673 4782 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158684 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158695 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158709 4782 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158720 4782 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158730 4782 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158741 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158757 4782 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158768 4782 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158778 4782 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158789 4782 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158804 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158815 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158826 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158839 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158849 4782 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158858 4782 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158868 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158882 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158892 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158905 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158916 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158930 4782 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158941 4782 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158952 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158965 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158975 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.158999 4782 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159012 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159027 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159039 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159052 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159062 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159076 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159089 4782 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159100 4782 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159111 4782 reconciler_common.go:293] "Volume 
detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159124 4782 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159134 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159145 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159171 4782 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159182 4782 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159192 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159202 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159215 4782 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159247 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159260 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159300 4782 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159317 4782 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159328 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") 
on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159353 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159368 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159386 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159398 4782 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159410 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159424 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159435 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159448 4782 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159460 4782 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159474 4782 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159486 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159499 4782 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159512 4782 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159527 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159538 4782 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159551 4782 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159531 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159567 4782 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159619 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159652 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.159884 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.162819 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.165356 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.166028 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.168368 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.168791 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.170402 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.173227 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.173325 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.173693 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.182910 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.202344 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.205043 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.218769 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.250769 4782 csr.go:261] certificate signing request csr-cvsnr is approved, waiting to be issued Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.260838 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261063 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261158 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261261 4782 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261337 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261398 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261454 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261510 4782 reconciler_common.go:293] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261567 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261622 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261677 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261729 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.261856 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.262045 4782 csr.go:257] certificate signing request csr-cvsnr is issued Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.278348 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.288354 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.359403 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 00:15:10.395481477 +0000 UTC Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.410011 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.410185 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.567119 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"20048a4370cc68d0636246767c213059d4797f6fb4c09778e857b2a22fc919c9"} Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.567176 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fb982de24d53a2b5d424132cb5f027d5ed07723ed2a1907873e22eeb2f77f671"} Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.567185 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.567323 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:30:46.567307255 +0000 UTC m=+22.835685270 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.568289 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"27529a84659eafedbc5e856a6ffbc5612adb6466d61e571b49fa466ea782d856"} Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.570165 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797"} Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.570258 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fcf07d82221a2dfcc27f6e037e708bb0c2b3973c7ac77bce4d7dbabfc6f9450f"} Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.584842 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.597603 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.608321 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.616863 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.626121 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.640442 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.668187 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.668250 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.668270 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.668322 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668378 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668406 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668422 4782 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668426 4782 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668481 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:46.668467349 +0000 UTC m=+22.936845374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668535 4782 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668561 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668591 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668607 4782 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668559 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-30 18:30:46.668539061 +0000 UTC m=+22.936917086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668670 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:46.668647384 +0000 UTC m=+22.937025409 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:45 crc kubenswrapper[4782]: E0130 18:30:45.668697 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:46.668688605 +0000 UTC m=+22.937066630 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.904407 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.908534 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.916386 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.932044 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.942080 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.944074 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.954139 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.964499 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.975963 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.989521 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f5b978-22ce-4ef1-8792-a3e12ba1af5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T18:30:38Z\\\",\\\"message\\\":\\\"W0130 18:30:28.026276 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0130 
18:30:28.026603 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769797828 cert, and key in /tmp/serving-cert-2374018079/serving-signer.crt, /tmp/serving-cert-2374018079/serving-signer.key\\\\nI0130 18:30:28.358424 1 observer_polling.go:159] Starting file observer\\\\nW0130 18:30:28.361143 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 18:30:28.361345 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 18:30:28.364294 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2374018079/tls.crt::/tmp/serving-cert-2374018079/tls.key\\\\\\\"\\\\nF0130 18:30:38.703405 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T18:30:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:24Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:45 crc kubenswrapper[4782]: I0130 18:30:45.999445 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.014860 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.028543 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.039491 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.055918 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.077382 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.264054 4782 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-30 18:25:45 +0000 UTC, rotation deadline is 2026-12-11 21:51:54.168469604 +0000 UTC Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.264120 4782 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7563h21m7.904353124s for next certificate rotation Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.359971 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 10:36:36.956207186 +0000 UTC Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.410028 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.410081 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.410200 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.410316 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.415735 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.416215 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.417079 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.417699 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.418273 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.418733 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.419371 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.419886 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.420524 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.421014 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.421581 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.422205 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.422687 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.423249 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.423757 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.424305 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.424845 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.425219 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.426660 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.427642 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.428366 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.429142 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.429812 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.430755 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.431428 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.432321 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.433199 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.433907 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.435523 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.436371 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.437104 4782 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.437264 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.438636 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.439139 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.439584 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.440765 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.441444 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.442002 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.443706 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.444694 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.445177 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.445903 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.446567 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.447160 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.447679 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.448213 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.448746 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.449496 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.449992 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.450457 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.450913 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.451448 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.451987 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.453979 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.575677 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c9952e180dbea5d2550954e5748e10e4780377a0b37feafb883434d8dcc4bfb1"} Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.577409 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.577568 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:30:48.577549722 +0000 UTC m=+24.845927747 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.582718 4782 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.592495 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.610031 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.626332 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f5b978-22ce-4ef1-8792-a3e12ba1af5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T18:30:38Z\\\",\\\"message\\\":\\\"W0130 18:30:28.026276 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0130 18:30:28.026603 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769797828 cert, and key in /tmp/serving-cert-2374018079/serving-signer.crt, /tmp/serving-cert-2374018079/serving-signer.key\\\\nI0130 18:30:28.358424 1 observer_polling.go:159] Starting file observer\\\\nW0130 18:30:28.361143 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 18:30:28.361345 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 18:30:28.364294 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2374018079/tls.crt::/tmp/serving-cert-2374018079/tls.key\\\\\\\"\\\\nF0130 18:30:38.703405 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T18:30:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.642281 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.663055 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9952e180dbea5d2550954e5748e10e4780377a0b37feafb883434d8dcc4bfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20048a4370cc68d0636246767c213059d4797f6fb4c09778e857b2a22fc919c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.676359 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.678698 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.678763 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.678806 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.678869 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.678956 4782 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.678966 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679006 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679021 4782 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679029 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:48.679007873 +0000 UTC m=+24.947385898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679080 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:48.679062074 +0000 UTC m=+24.947440099 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679434 4782 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679487 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:48.679476174 +0000 UTC m=+24.947854419 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679604 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679665 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679687 4782 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:46 crc kubenswrapper[4782]: E0130 18:30:46.679774 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:48.679748161 +0000 UTC m=+24.948126216 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.701636 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.723494 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8cjnc"] Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.723805 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.725793 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.725921 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.726445 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.750977 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.768636 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.781456 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8cjnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"786fd7c6-c8be-4c4d-8c88-4e24747e78db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pkn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8cjnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.806159 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.823786 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f5b978-22ce-4ef1-8792-a3e12ba1af5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T18:30:38Z\\\",\\\"message\\\":\\\"W0130 18:30:28.026276 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0130 
18:30:28.026603 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769797828 cert, and key in /tmp/serving-cert-2374018079/serving-signer.crt, /tmp/serving-cert-2374018079/serving-signer.key\\\\nI0130 18:30:28.358424 1 observer_polling.go:159] Starting file observer\\\\nW0130 18:30:28.361143 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 18:30:28.361345 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 18:30:28.364294 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2374018079/tls.crt::/tmp/serving-cert-2374018079/tls.key\\\\\\\"\\\\nF0130 18:30:38.703405 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T18:30:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:24Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.837617 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.851362 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9952e180dbea5d2550954e5748e10e4780377a0b37feafb883434d8dcc4bfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20048a4370cc68d0636246767c213059d4797f6fb4c09778e857b2a22fc919c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.864922 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:46Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.881351 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pkn9\" (UniqueName: \"kubernetes.io/projected/786fd7c6-c8be-4c4d-8c88-4e24747e78db-kube-api-access-6pkn9\") pod \"node-resolver-8cjnc\" (UID: \"786fd7c6-c8be-4c4d-8c88-4e24747e78db\") " pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.881430 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/786fd7c6-c8be-4c4d-8c88-4e24747e78db-hosts-file\") pod \"node-resolver-8cjnc\" (UID: \"786fd7c6-c8be-4c4d-8c88-4e24747e78db\") " pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.983095 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pkn9\" (UniqueName: \"kubernetes.io/projected/786fd7c6-c8be-4c4d-8c88-4e24747e78db-kube-api-access-6pkn9\") pod \"node-resolver-8cjnc\" (UID: \"786fd7c6-c8be-4c4d-8c88-4e24747e78db\") " pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.983164 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/786fd7c6-c8be-4c4d-8c88-4e24747e78db-hosts-file\") pod \"node-resolver-8cjnc\" (UID: \"786fd7c6-c8be-4c4d-8c88-4e24747e78db\") " pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:46 crc kubenswrapper[4782]: I0130 18:30:46.983316 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/786fd7c6-c8be-4c4d-8c88-4e24747e78db-hosts-file\") pod \"node-resolver-8cjnc\" (UID: \"786fd7c6-c8be-4c4d-8c88-4e24747e78db\") " pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.004809 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pkn9\" (UniqueName: \"kubernetes.io/projected/786fd7c6-c8be-4c4d-8c88-4e24747e78db-kube-api-access-6pkn9\") pod \"node-resolver-8cjnc\" (UID: \"786fd7c6-c8be-4c4d-8c88-4e24747e78db\") " pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.037463 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8cjnc" Jan 30 18:30:47 crc kubenswrapper[4782]: W0130 18:30:47.049615 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod786fd7c6_c8be_4c4d_8c88_4e24747e78db.slice/crio-1ae58bc3900b59c321335cd2637665e198f9ab1a1c6702c0f9921ef084575dff WatchSource:0}: Error finding container 1ae58bc3900b59c321335cd2637665e198f9ab1a1c6702c0f9921ef084575dff: Status 404 returned error can't find the container with id 1ae58bc3900b59c321335cd2637665e198f9ab1a1c6702c0f9921ef084575dff Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.129556 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-47tfl"] Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.130339 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-p7zdh"] Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.130529 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.130667 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.133197 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: W0130 18:30:47.135009 4782 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 30 18:30:47 crc kubenswrapper[4782]: E0130 18:30:47.135077 4782 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.141515 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.141778 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.142383 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.142523 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: W0130 18:30:47.142663 4782 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no 
relationship found between node 'crc' and this object Jan 30 18:30:47 crc kubenswrapper[4782]: E0130 18:30:47.142729 4782 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 18:30:47 crc kubenswrapper[4782]: W0130 18:30:47.142803 4782 reflector.go:561] object-"openshift-machine-config-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 30 18:30:47 crc kubenswrapper[4782]: E0130 18:30:47.142829 4782 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 18:30:47 crc kubenswrapper[4782]: W0130 18:30:47.142896 4782 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 30 18:30:47 crc kubenswrapper[4782]: E0130 18:30:47.142982 4782 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 18:30:47 crc kubenswrapper[4782]: W0130 18:30:47.143618 4782 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Jan 30 18:30:47 crc kubenswrapper[4782]: E0130 18:30:47.143696 4782 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.143997 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qdgpq"] Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.144985 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.149725 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.167172 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.171878 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.192756 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.209705 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8cjnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"786fd7c6-c8be-4c4d-8c88-4e24747e78db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pkn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8cjnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is 
after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.241973 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.280627 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9952e180dbea5d2550954e5748e10e4780377a0b37feafb883434d8dcc4bfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20048a4370cc68d0636246767c213059d4797f6fb4c09778e857b2a22fc919c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287164 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a030950a-4dec-42c1-a494-6ecd3413b010-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc 
kubenswrapper[4782]: I0130 18:30:47.287221 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-netns\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287280 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-cni-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287304 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-cni-binary-copy\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287329 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-kubelet\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287360 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-system-cni-dir\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287379 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpjwj\" (UniqueName: \"kubernetes.io/projected/a030950a-4dec-42c1-a494-6ecd3413b010-kube-api-access-qpjwj\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287400 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eeb02b9-cc00-423a-87f6-2c326af45ceb-proxy-tls\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287421 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-system-cni-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287439 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-socket-dir-parent\") pod \"multus-qdgpq\" (UID: 
\"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287503 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkwcj\" (UniqueName: \"kubernetes.io/projected/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-kube-api-access-wkwcj\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287534 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-conf-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287557 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5eeb02b9-cc00-423a-87f6-2c326af45ceb-rootfs\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287578 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eeb02b9-cc00-423a-87f6-2c326af45ceb-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287705 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-cnibin\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287753 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-k8s-cni-cncf-io\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287794 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-tuning-conf-dir\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287815 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a030950a-4dec-42c1-a494-6ecd3413b010-cni-binary-copy\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287861 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-cni-bin\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287900 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-hostroot\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287923 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-cni-multus\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287944 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-daemon-config\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287964 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-os-release\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.287984 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-os-release\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.288013 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-multus-certs\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.288033 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-etc-kubernetes\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.288065 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xd8r\" (UniqueName: \"kubernetes.io/projected/5eeb02b9-cc00-423a-87f6-2c326af45ceb-kube-api-access-4xd8r\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.288086 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-cnibin\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.302070 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.321848 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a030950a-4dec-42c1-a494-6ecd3413b010\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47tfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.333744 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eeb02b9-cc00-423a-87f6-2c326af45ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4xd8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4xd8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7zdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.345965 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f5b978-22ce-4ef1-8792-a3e12ba1af5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T18:30:38Z\\\",\\\"message\\\":\\\"W0130 18:30:28.026276 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0130 
18:30:28.026603 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769797828 cert, and key in /tmp/serving-cert-2374018079/serving-signer.crt, /tmp/serving-cert-2374018079/serving-signer.key\\\\nI0130 18:30:28.358424 1 observer_polling.go:159] Starting file observer\\\\nW0130 18:30:28.361143 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 18:30:28.361345 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 18:30:28.364294 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2374018079/tls.crt::/tmp/serving-cert-2374018079/tls.key\\\\\\\"\\\\nF0130 18:30:38.703405 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T18:30:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:24Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.358253 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.360339 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:33:40.7798514 +0000 UTC Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.372475 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.385557 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389492 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-os-release\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389529 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-os-release\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389548 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-multus-certs\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389567 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-etc-kubernetes\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389594 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xd8r\" (UniqueName: \"kubernetes.io/projected/5eeb02b9-cc00-423a-87f6-2c326af45ceb-kube-api-access-4xd8r\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389609 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-cnibin\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389625 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a030950a-4dec-42c1-a494-6ecd3413b010-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389645 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-netns\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389669 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-cni-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389682 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-cni-binary-copy\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389698 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-kubelet\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389713 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-system-cni-dir\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389715 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-multus-certs\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389802 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-cnibin\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389819 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-os-release\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " 
pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389819 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-os-release\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389908 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-kubelet\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389904 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-etc-kubernetes\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389938 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-system-cni-dir\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389966 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-netns\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.389732 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpjwj\" (UniqueName: \"kubernetes.io/projected/a030950a-4dec-42c1-a494-6ecd3413b010-kube-api-access-qpjwj\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390009 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eeb02b9-cc00-423a-87f6-2c326af45ceb-proxy-tls\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390028 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-system-cni-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390043 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-socket-dir-parent\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390103 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wkwcj\" (UniqueName: \"kubernetes.io/projected/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-kube-api-access-wkwcj\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390110 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-cni-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390125 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-conf-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390147 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5eeb02b9-cc00-423a-87f6-2c326af45ceb-rootfs\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390150 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-socket-dir-parent\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390163 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eeb02b9-cc00-423a-87f6-2c326af45ceb-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390185 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-conf-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390223 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-cnibin\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390256 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-k8s-cni-cncf-io\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390264 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/5eeb02b9-cc00-423a-87f6-2c326af45ceb-rootfs\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390280 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-tuning-conf-dir\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390298 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a030950a-4dec-42c1-a494-6ecd3413b010-cni-binary-copy\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390315 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-cni-bin\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390330 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-hostroot\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390356 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-cni-multus\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390376 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-daemon-config\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390412 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-system-cni-dir\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390485 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-cnibin\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390549 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-run-k8s-cni-cncf-io\") pod 
\"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390567 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-cni-binary-copy\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390577 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-cni-bin\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390613 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-hostroot\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390615 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a030950a-4dec-42c1-a494-6ecd3413b010-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.390640 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-host-var-lib-cni-multus\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.391004 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-multus-daemon-config\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.391022 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a030950a-4dec-42c1-a494-6ecd3413b010-tuning-conf-dir\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.391041 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a030950a-4dec-42c1-a494-6ecd3413b010-cni-binary-copy\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.401598 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8cjnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"786fd7c6-c8be-4c4d-8c88-4e24747e78db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pkn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8cjnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.404507 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.409333 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.410673 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:47 crc kubenswrapper[4782]: E0130 18:30:47.410785 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.412475 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpjwj\" (UniqueName: \"kubernetes.io/projected/a030950a-4dec-42c1-a494-6ecd3413b010-kube-api-access-qpjwj\") pod \"multus-additional-cni-plugins-47tfl\" (UID: \"a030950a-4dec-42c1-a494-6ecd3413b010\") " pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.418673 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.419182 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.426217 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkwcj\" (UniqueName: \"kubernetes.io/projected/c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6-kube-api-access-wkwcj\") pod \"multus-qdgpq\" (UID: \"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\") " pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.435928 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9952e180dbea5d2550954e5748e10e4780377a0b37feafb883434d8dcc4bfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20048a4370cc68d0636246767c213059d4797f6fb4c09778e857b2a22fc919c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.449672 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.462077 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a030950a-4dec-42c1-a494-6ecd3413b010\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qpjwj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-47tfl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.465183 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-47tfl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.474515 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eeb02b9-cc00-423a-87f6-2c326af45ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4xd8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4xd8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7zdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 
2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.488879 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qdgpq" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.491426 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qdgpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkwcj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qdgpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.503613 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f5b978-22ce-4ef1-8792-a3e12ba1af5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T18:30:38Z\\\",\\\"message\\\":\\\"W0130 18:30:28.026276 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0130 18:30:28.026603 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769797828 cert, and key in /tmp/serving-cert-2374018079/serving-signer.crt, /tmp/serving-cert-2374018079/serving-signer.key\\\\nI0130 18:30:28.358424 1 observer_polling.go:159] Starting file observer\\\\nW0130 18:30:28.361143 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 18:30:28.361345 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 18:30:28.364294 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2374018079/tls.crt::/tmp/serving-cert-2374018079/tls.key\\\\\\\"\\\\nF0130 18:30:38.703405 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T18:30:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.518894 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.538497 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c6a78466a70e102fa53d61afbdcd70c9f1e78241cb86f09b0c3a677c9e14797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.549664 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4b78e2f-15ed-44b6-aa61-3e0bce6744b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77a0be3e68f11584e3bc7a8863e78036e721e9df9c64e0bb9927e5c718aa562c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee770bfa5252c9d774f9f21d0f942eb6eeac2f5c8708cc9daa5afed482debec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04cce4444fd40cff3896cb7085e5f5c874770d282bf660e3eb7ba1141e0f0a01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728cc6c30e2beeea5b30c93859b8a6f6b78958ac2835c3510d4e52d2c2e834b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:24Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.565885 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.582520 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8cjnc" event={"ID":"786fd7c6-c8be-4c4d-8c88-4e24747e78db","Type":"ContainerStarted","Data":"bb205af458c7e7c52bc1b603c9dd621b94d9a1bbb950b8b17aec4ac3418ed0e3"} Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.582565 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8cjnc" event={"ID":"786fd7c6-c8be-4c4d-8c88-4e24747e78db","Type":"ContainerStarted","Data":"1ae58bc3900b59c321335cd2637665e198f9ab1a1c6702c0f9921ef084575dff"} Jan 30 18:30:47 crc kubenswrapper[4782]: E0130 18:30:47.592036 4782 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.592418 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: W0130 18:30:47.594210 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e9fe26_a9ff_4c88_af9a_695c9a46ffe6.slice/crio-f482b725f2160608a10219ac28d5283c1cfa542b0b5af5f6138bafbaccb05686 WatchSource:0}: Error finding container f482b725f2160608a10219ac28d5283c1cfa542b0b5af5f6138bafbaccb05686: Status 404 returned error can't find the container with id f482b725f2160608a10219ac28d5283c1cfa542b0b5af5f6138bafbaccb05686 Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.604610 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8cjnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"786fd7c6-c8be-4c4d-8c88-4e24747e78db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pkn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8cjnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.619176 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eeb02b9-cc00-423a-87f6-2c326af45ceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4xd8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4xd8r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7zdh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.636482 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qdgpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wkwcj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qdgpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.650059 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5f5b978-22ce-4ef1-8792-a3e12ba1af5c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T18:30:38Z\\\",\\\"message\\\":\\\"W0130 18:30:28.026276 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0130 18:30:28.026603 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769797828 cert, and key in /tmp/serving-cert-2374018079/serving-signer.crt, /tmp/serving-cert-2374018079/serving-signer.key\\\\nI0130 18:30:28.358424 1 observer_polling.go:159] Starting file observer\\\\nW0130 18:30:28.361143 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0130 18:30:28.361345 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0130 18:30:28.364294 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2374018079/tls.crt::/tmp/serving-cert-2374018079/tls.key\\\\\\\"\\\\nF0130 18:30:38.703405 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:27Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T18:30:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T18:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T18:30:24Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.664808 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.676342 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9952e180dbea5d2550954e5748e10e4780377a0b37feafb883434d8dcc4bfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20048a4370cc68d0636246767c213059d4797f6fb4c09778e857b2a22fc919c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T18:30:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.688413 4782 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T18:30:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.761347 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8cjnc" podStartSLOduration=2.761326015 podStartE2EDuration="2.761326015s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:47.746933049 +0000 UTC m=+24.015311084" watchObservedRunningTime="2026-01-30 18:30:47.761326015 +0000 UTC m=+24.029704030" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.792953 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=2.792921876 podStartE2EDuration="2.792921876s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:47.792334101 +0000 UTC m=+24.060712126" watchObservedRunningTime="2026-01-30 18:30:47.792921876 +0000 UTC m=+24.061299911" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.871478 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=0.871457955 podStartE2EDuration="871.457955ms" podCreationTimestamp="2026-01-30 18:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:47.870860411 +0000 UTC m=+24.139238436" watchObservedRunningTime="2026-01-30 18:30:47.871457955 +0000 UTC m=+24.139835990" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.916458 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lxk6x"] Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.917373 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.919507 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.919837 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.919922 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.920530 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.920646 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.920657 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.921421 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.951316 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.974272 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.978051 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lc8l9"] Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.978513 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.980431 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.980706 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.980770 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.980873 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.985972 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xd8r\" (UniqueName: \"kubernetes.io/projected/5eeb02b9-cc00-423a-87f6-2c326af45ceb-kube-api-access-4xd8r\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:47 crc kubenswrapper[4782]: I0130 18:30:47.998547 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095076 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-systemd-units\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095298 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-var-lib-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095396 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-netd\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095472 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-script-lib\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095553 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-systemd\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095618 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-host\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095687 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-env-overrides\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095762 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovn-node-metrics-cert\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095840 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.095929 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-netns\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096004 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-config\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096191 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096288 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-bin\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096317 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr2zm\" (UniqueName: \"kubernetes.io/projected/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-kube-api-access-hr2zm\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc 
kubenswrapper[4782]: I0130 18:30:48.096373 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-log-socket\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096416 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4r8x\" (UniqueName: \"kubernetes.io/projected/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-kube-api-access-x4r8x\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096447 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096486 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-kubelet\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096511 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-slash\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096640 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-ovn\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096775 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-etc-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096808 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-node-log\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.096900 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-serviceca\") pod \"node-ca-lc8l9\" (UID: 
\"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198174 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198267 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovn-node-metrics-cert\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198292 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-netns\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198318 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-config\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198369 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198414 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr2zm\" (UniqueName: \"kubernetes.io/projected/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-kube-api-access-hr2zm\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198443 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-bin\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198468 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-log-socket\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198502 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lxk6x\" (UID: 
\"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198525 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4r8x\" (UniqueName: \"kubernetes.io/projected/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-kube-api-access-x4r8x\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198563 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-kubelet\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198581 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-slash\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198597 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-ovn\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198614 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-serviceca\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198631 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-etc-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198648 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-node-log\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198664 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-var-lib-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198682 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-netd\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc 
kubenswrapper[4782]: I0130 18:30:48.198702 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-systemd-units\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198719 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-script-lib\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198734 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-host\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198749 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-systemd\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.198765 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-env-overrides\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.199135 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m"] Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.199319 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-env-overrides\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.199375 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.199618 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.200316 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-slash\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.200362 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-netns\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.200887 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-config\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.200931 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.201102 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-bin\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.201133 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-log-socket\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.201156 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.201921 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-var-lib-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.201973 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-etc-openvswitch\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc 
kubenswrapper[4782]: I0130 18:30:48.201996 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-node-log\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202172 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-host\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202296 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-systemd\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202298 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-kubelet\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202325 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-ovn\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202364 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-netd\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202373 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-systemd-units\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202522 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-script-lib\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.202702 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.203044 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-serviceca\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.205271 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovn-node-metrics-cert\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.207019 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.223705 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4r8x\" (UniqueName: \"kubernetes.io/projected/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-kube-api-access-x4r8x\") pod \"ovnkube-node-lxk6x\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.226331 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr2zm\" (UniqueName: \"kubernetes.io/projected/4facc72e-1e36-4fa4-8544-5e60c51bdeb8-kube-api-access-hr2zm\") pod \"node-ca-lc8l9\" (UID: \"4facc72e-1e36-4fa4-8544-5e60c51bdeb8\") " pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.233208 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:48 crc kubenswrapper[4782]: W0130 18:30:48.248044 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd1fb9ae_9c56_4d08_b0ef_c661158367ce.slice/crio-3b5b137ae156ad5ef5ed0bf5b934def26c5189163af9d7b46198d06ebd991cda WatchSource:0}: Error finding container 3b5b137ae156ad5ef5ed0bf5b934def26c5189163af9d7b46198d06ebd991cda: Status 404 returned error can't find the container with id 3b5b137ae156ad5ef5ed0bf5b934def26c5189163af9d7b46198d06ebd991cda Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.262901 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-d7zh6"] Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.263588 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.263754 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7zh6" podUID="327fdbe8-f465-4ab9-9478-c937cb925ca1" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.294971 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lc8l9" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.299404 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.299483 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.299539 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmlkw\" (UniqueName: \"kubernetes.io/projected/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-kube-api-access-cmlkw\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.299567 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.319056 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 18:30:48 crc kubenswrapper[4782]: W0130 18:30:48.322319 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4facc72e_1e36_4fa4_8544_5e60c51bdeb8.slice/crio-e90398d80a32a5da4c6e2c46e3fe26c3ee1d689d7993aa142f0946b059af9222 WatchSource:0}: Error finding container e90398d80a32a5da4c6e2c46e3fe26c3ee1d689d7993aa142f0946b059af9222: Status 404 returned error can't find the container with id e90398d80a32a5da4c6e2c46e3fe26c3ee1d689d7993aa142f0946b059af9222 Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.324645 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eeb02b9-cc00-423a-87f6-2c326af45ceb-proxy-tls\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.361468 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 10:42:43.991901307 +0000 UTC Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.391518 4782 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.391659 4782 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5eeb02b9-cc00-423a-87f6-2c326af45ceb-mcd-auth-proxy-config podName:5eeb02b9-cc00-423a-87f6-2c326af45ceb nodeName:}" failed. No retries permitted until 2026-01-30 18:30:48.891634462 +0000 UTC m=+25.160012497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/5eeb02b9-cc00-423a-87f6-2c326af45ceb-mcd-auth-proxy-config") pod "machine-config-daemon-p7zdh" (UID: "5eeb02b9-cc00-423a-87f6-2c326af45ceb") : failed to sync configmap cache: timed out waiting for the condition Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.405334 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.405416 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.405468 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmlkw\" (UniqueName: \"kubernetes.io/projected/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-kube-api-access-cmlkw\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.405499 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.405520 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.405544 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqfhj\" (UniqueName: \"kubernetes.io/projected/327fdbe8-f465-4ab9-9478-c937cb925ca1-kube-api-access-lqfhj\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.406279 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.409419 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.410179 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.410327 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.410430 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.410565 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.410625 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.414643 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.437076 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmlkw\" (UniqueName: \"kubernetes.io/projected/1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e-kube-api-access-cmlkw\") pod \"ovnkube-control-plane-749d76644c-2n49m\" (UID: \"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.506944 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.506997 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqfhj\" (UniqueName: 
\"kubernetes.io/projected/327fdbe8-f465-4ab9-9478-c937cb925ca1-kube-api-access-lqfhj\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.507443 4782 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.507534 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs podName:327fdbe8-f465-4ab9-9478-c937cb925ca1 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:49.00751399 +0000 UTC m=+25.275892015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs") pod "network-metrics-daemon-d7zh6" (UID: "327fdbe8-f465-4ab9-9478-c937cb925ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.524961 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqfhj\" (UniqueName: \"kubernetes.io/projected/327fdbe8-f465-4ab9-9478-c937cb925ca1-kube-api-access-lqfhj\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.554596 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" Jan 30 18:30:48 crc kubenswrapper[4782]: W0130 18:30:48.566163 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ffea8ef_20b5_4c4a_8f98_ec9522dfc86e.slice/crio-7894db9fbb460115d39bb2a2d21cbc4ce3662e30a47f6d252c4e0bf24b7fafe5 WatchSource:0}: Error finding container 7894db9fbb460115d39bb2a2d21cbc4ce3662e30a47f6d252c4e0bf24b7fafe5: Status 404 returned error can't find the container with id 7894db9fbb460115d39bb2a2d21cbc4ce3662e30a47f6d252c4e0bf24b7fafe5 Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.587023 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9eac7de013572bf01b68f2dd1756e0561401cf2e187cf589a88addc1801ac24b"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.589828 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lc8l9" event={"ID":"4facc72e-1e36-4fa4-8544-5e60c51bdeb8","Type":"ContainerStarted","Data":"451fedeff89001d56b95666bc9120de216d10a685bdb39a597fdc27ce756907b"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.589901 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lc8l9" event={"ID":"4facc72e-1e36-4fa4-8544-5e60c51bdeb8","Type":"ContainerStarted","Data":"e90398d80a32a5da4c6e2c46e3fe26c3ee1d689d7993aa142f0946b059af9222"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.591970 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qdgpq" event={"ID":"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6","Type":"ContainerStarted","Data":"7190924486a58947a7a39b8e6ae7a95007953ffcb2ccc40f7d61e2bf38e80b2c"} Jan 30 18:30:48 crc 
kubenswrapper[4782]: I0130 18:30:48.592084 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qdgpq" event={"ID":"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6","Type":"ContainerStarted","Data":"f482b725f2160608a10219ac28d5283c1cfa542b0b5af5f6138bafbaccb05686"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.594298 4782 generic.go:334] "Generic (PLEG): container finished" podID="a030950a-4dec-42c1-a494-6ecd3413b010" containerID="60827fc1d3e0c1444cfec5347f796c0d9513b08eefe43882fcc56ecf80dd2201" exitCode=0 Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.594466 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerDied","Data":"60827fc1d3e0c1444cfec5347f796c0d9513b08eefe43882fcc56ecf80dd2201"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.594517 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerStarted","Data":"388327a1d600741d12aaa74712c7343c98fc5591179b6331831e71c6400352b0"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.595576 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" event={"ID":"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e","Type":"ContainerStarted","Data":"7894db9fbb460115d39bb2a2d21cbc4ce3662e30a47f6d252c4e0bf24b7fafe5"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.597158 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8" exitCode=0 Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.597552 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.597586 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"3b5b137ae156ad5ef5ed0bf5b934def26c5189163af9d7b46198d06ebd991cda"} Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.607717 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.608015 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:30:52.607981807 +0000 UTC m=+28.876359952 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.650615 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lc8l9" podStartSLOduration=3.650590332 podStartE2EDuration="3.650590332s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:48.649942567 +0000 UTC m=+24.918320612" watchObservedRunningTime="2026-01-30 18:30:48.650590332 +0000 UTC m=+24.918968367" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.693445 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qdgpq" podStartSLOduration=3.693419293 podStartE2EDuration="3.693419293s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:48.693096555 +0000 UTC m=+24.961474580" watchObservedRunningTime="2026-01-30 18:30:48.693419293 +0000 UTC m=+24.961797328" Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.710562 4782 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.710684 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:52.710659198 +0000 UTC m=+28.979037243 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.710780 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.710914 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.711119 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.711278 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.711747 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.711790 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.711813 4782 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.711862 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:52.711846786 +0000 UTC m=+28.980224971 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.712260 4782 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.712305 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:52.712294037 +0000 UTC m=+28.980672082 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.712304 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.712343 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.712366 4782 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:48 crc kubenswrapper[4782]: E0130 18:30:48.712430 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:52.71241254 +0000 UTC m=+28.980790565 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.946357 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eeb02b9-cc00-423a-87f6-2c326af45ceb-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.946976 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eeb02b9-cc00-423a-87f6-2c326af45ceb-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7zdh\" (UID: \"5eeb02b9-cc00-423a-87f6-2c326af45ceb\") " pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:48 crc kubenswrapper[4782]: I0130 18:30:48.979483 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:30:48 crc kubenswrapper[4782]: W0130 18:30:48.991724 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eeb02b9_cc00_423a_87f6_2c326af45ceb.slice/crio-28f6e51c6853fa6a79d07ebc959405d87bc847fa440340937e6b6e10768470e9 WatchSource:0}: Error finding container 28f6e51c6853fa6a79d07ebc959405d87bc847fa440340937e6b6e10768470e9: Status 404 returned error can't find the container with id 28f6e51c6853fa6a79d07ebc959405d87bc847fa440340937e6b6e10768470e9 Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.047286 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:49 crc kubenswrapper[4782]: E0130 18:30:49.047480 4782 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:49 crc kubenswrapper[4782]: E0130 18:30:49.047909 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs podName:327fdbe8-f465-4ab9-9478-c937cb925ca1 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:50.047884902 +0000 UTC m=+26.316262927 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs") pod "network-metrics-daemon-d7zh6" (UID: "327fdbe8-f465-4ab9-9478-c937cb925ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.362516 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 22:49:47.088207802 +0000 UTC Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.409793 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:49 crc kubenswrapper[4782]: E0130 18:30:49.409945 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.604920 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.604978 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.604989 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.605001 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.605017 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.605050 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.608174 4782 generic.go:334] "Generic (PLEG): container finished" podID="a030950a-4dec-42c1-a494-6ecd3413b010" containerID="c68b61402a53e60fb09733f101678439051b4079ab1a7825e3e0c42689c68ed0" exitCode=0 Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.608254 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerDied","Data":"c68b61402a53e60fb09733f101678439051b4079ab1a7825e3e0c42689c68ed0"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.610022 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"26fc4efa63bdb6347518fde54f10a5dcd16cf18a674aecbd4f61b2b5ec6398d9"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.610064 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"b2a0d39c1e6147d8ef68aae5bf24b942a57ebd4574f48fabe6e7064cbfa21267"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.610075 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"28f6e51c6853fa6a79d07ebc959405d87bc847fa440340937e6b6e10768470e9"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.611847 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" event={"ID":"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e","Type":"ContainerStarted","Data":"733c412262ecfac10002bc2d884f2845959849f40614d093c7cd6c90c4f6e39b"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.611904 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" event={"ID":"1ffea8ef-20b5-4c4a-8f98-ec9522dfc86e","Type":"ContainerStarted","Data":"367ec76b999697bf067efbacd02527941c4e024c7178455fc30d33a5fcbcfb97"} Jan 30 18:30:49 crc kubenswrapper[4782]: I0130 18:30:49.657263 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podStartSLOduration=4.657220272 podStartE2EDuration="4.657220272s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:49.657029238 +0000 UTC m=+25.925407353" watchObservedRunningTime="2026-01-30 18:30:49.657220272 +0000 UTC m=+25.925598297" Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.059036 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:50 crc kubenswrapper[4782]: E0130 18:30:50.059292 4782 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:50 crc kubenswrapper[4782]: E0130 18:30:50.059394 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs podName:327fdbe8-f465-4ab9-9478-c937cb925ca1 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:52.059369219 +0000 UTC m=+28.327747244 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs") pod "network-metrics-daemon-d7zh6" (UID: "327fdbe8-f465-4ab9-9478-c937cb925ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.363209 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:18:25.287062666 +0000 UTC Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.410736 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.410808 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.410913 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:50 crc kubenswrapper[4782]: E0130 18:30:50.410941 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:50 crc kubenswrapper[4782]: E0130 18:30:50.411088 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:50 crc kubenswrapper[4782]: E0130 18:30:50.411273 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d7zh6" podUID="327fdbe8-f465-4ab9-9478-c937cb925ca1" Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.618110 4782 generic.go:334] "Generic (PLEG): container finished" podID="a030950a-4dec-42c1-a494-6ecd3413b010" containerID="17d239557abdfe0e09e34ab009d1fea96984f09f8b10a728d0277700d1681f97" exitCode=0 Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.618353 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerDied","Data":"17d239557abdfe0e09e34ab009d1fea96984f09f8b10a728d0277700d1681f97"} Jan 30 18:30:50 crc kubenswrapper[4782]: I0130 18:30:50.649641 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2n49m" podStartSLOduration=5.649611991 podStartE2EDuration="5.649611991s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:49.674735814 +0000 UTC m=+25.943113859" watchObservedRunningTime="2026-01-30 18:30:50.649611991 +0000 UTC m=+26.917990056" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.341134 4782 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.345288 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.345349 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.345371 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.345552 4782 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.357411 4782 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.357865 4782 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.359862 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.359906 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.359927 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.359951 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.359971 4782 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T18:30:51Z","lastTransitionTime":"2026-01-30T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.363444 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 07:56:30.43858272 +0000 UTC Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.381663 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.381728 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.381750 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.381783 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.381806 4782 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T18:30:51Z","lastTransitionTime":"2026-01-30T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.410420 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:51 crc kubenswrapper[4782]: E0130 18:30:51.410660 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.417028 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm"] Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.417721 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.420984 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.421150 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.422311 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.424583 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.473429 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.473557 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.473604 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.473649 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.473767 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.575109 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc 
kubenswrapper[4782]: I0130 18:30:51.575192 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.575498 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.575560 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.575605 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.575827 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.575923 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.576864 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.587830 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.595664 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b7d8830-0be8-43ca-b1bd-ca5066eb9caa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7s4gm\" (UID: \"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.625994 4782 generic.go:334] "Generic (PLEG): container finished" podID="a030950a-4dec-42c1-a494-6ecd3413b010" containerID="da115792941d111e250ae2b9ee73453f2d42b4543634c7e4d0e40e284a55dcae" exitCode=0 Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.626081 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerDied","Data":"da115792941d111e250ae2b9ee73453f2d42b4543634c7e4d0e40e284a55dcae"} Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.631610 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} Jan 30 18:30:51 crc kubenswrapper[4782]: I0130 18:30:51.750772 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" Jan 30 18:30:51 crc kubenswrapper[4782]: W0130 18:30:51.777448 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b7d8830_0be8_43ca_b1bd_ca5066eb9caa.slice/crio-687b664b78a797dcf297ee8d18f1ccdf8ec24df7f83a827ba9ece91406e7a94d WatchSource:0}: Error finding container 687b664b78a797dcf297ee8d18f1ccdf8ec24df7f83a827ba9ece91406e7a94d: Status 404 returned error can't find the container with id 687b664b78a797dcf297ee8d18f1ccdf8ec24df7f83a827ba9ece91406e7a94d Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.081700 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.081964 4782 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.082193 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs podName:327fdbe8-f465-4ab9-9478-c937cb925ca1 nodeName:}" failed. No retries permitted until 2026-01-30 18:30:56.08217615 +0000 UTC m=+32.350554175 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs") pod "network-metrics-daemon-d7zh6" (UID: "327fdbe8-f465-4ab9-9478-c937cb925ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.172410 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.364573 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 13:04:21.032776631 +0000 UTC Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.364681 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.378185 4782 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.410411 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.410473 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.410538 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.410702 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.410847 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7zh6" podUID="327fdbe8-f465-4ab9-9478-c937cb925ca1" Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.410969 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.640340 4782 generic.go:334] "Generic (PLEG): container finished" podID="a030950a-4dec-42c1-a494-6ecd3413b010" containerID="4b9c02477627569ff33c80fcf834c9d5fa93ec1ceceacb032e2774cf34ae1efa" exitCode=0 Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.640434 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerDied","Data":"4b9c02477627569ff33c80fcf834c9d5fa93ec1ceceacb032e2774cf34ae1efa"} Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.643052 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" event={"ID":"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa","Type":"ContainerStarted","Data":"7f73fb759f0fa9aaee47551fc15efd8e29b58dbad11ef571ce8c6522c602adb8"} Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.643127 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" event={"ID":"6b7d8830-0be8-43ca-b1bd-ca5066eb9caa","Type":"ContainerStarted","Data":"687b664b78a797dcf297ee8d18f1ccdf8ec24df7f83a827ba9ece91406e7a94d"} Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.687900 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.688109 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:00.688071709 +0000 UTC m=+36.956449864 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.789388 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.789461 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.789503 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:52 crc kubenswrapper[4782]: I0130 18:30:52.789532 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789539 4782 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789606 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:31:00.789590722 +0000 UTC m=+37.057968747 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789613 4782 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789649 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 18:31:00.789638633 +0000 UTC m=+37.058016658 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789772 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789821 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789834 4782 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789879 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789945 4782 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789966 4782 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.789913 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 18:31:00.789889419 +0000 UTC m=+37.058267434 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:52 crc kubenswrapper[4782]: E0130 18:30:52.790058 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 18:31:00.790027263 +0000 UTC m=+37.058405508 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 18:30:53 crc kubenswrapper[4782]: I0130 18:30:53.409676 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:53 crc kubenswrapper[4782]: E0130 18:30:53.409824 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:53 crc kubenswrapper[4782]: I0130 18:30:53.649524 4782 generic.go:334] "Generic (PLEG): container finished" podID="a030950a-4dec-42c1-a494-6ecd3413b010" containerID="e6e76310f5193eeea5a3e6ac290d99124fec4dd31020b7d7bc6028c869ab6247" exitCode=0 Jan 30 18:30:53 crc kubenswrapper[4782]: I0130 18:30:53.649582 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerDied","Data":"e6e76310f5193eeea5a3e6ac290d99124fec4dd31020b7d7bc6028c869ab6247"} Jan 30 18:30:53 crc kubenswrapper[4782]: I0130 18:30:53.678319 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7s4gm" podStartSLOduration=8.678303495 podStartE2EDuration="8.678303495s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:52.69265686 +0000 UTC m=+28.961034885" watchObservedRunningTime="2026-01-30 18:30:53.678303495 +0000 UTC m=+29.946681520" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.183680 4782 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.410471 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.410590 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.410628 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:54 crc kubenswrapper[4782]: E0130 18:30:54.411264 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:54 crc kubenswrapper[4782]: E0130 18:30:54.411390 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:54 crc kubenswrapper[4782]: E0130 18:30:54.411503 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d7zh6" podUID="327fdbe8-f465-4ab9-9478-c937cb925ca1" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.660357 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerStarted","Data":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.660924 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.666386 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-47tfl" event={"ID":"a030950a-4dec-42c1-a494-6ecd3413b010","Type":"ContainerStarted","Data":"fb1a927a408c2cf1d8dc0e71ea8bbee24c0f11f55ef9c8da0f413ff499be79b8"} Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.693859 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podStartSLOduration=9.69382465 podStartE2EDuration="9.69382465s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:54.693345568 +0000 UTC m=+30.961723593" watchObservedRunningTime="2026-01-30 18:30:54.69382465 +0000 UTC m=+30.962202705" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.743857 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:54 crc kubenswrapper[4782]: I0130 18:30:54.780339 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-47tfl" podStartSLOduration=9.780308131 podStartE2EDuration="9.780308131s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:30:54.722798927 +0000 UTC m=+30.991176952" watchObservedRunningTime="2026-01-30 18:30:54.780308131 +0000 UTC m=+31.048686156" Jan 30 18:30:55 crc kubenswrapper[4782]: I0130 18:30:55.410408 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:55 crc kubenswrapper[4782]: E0130 18:30:55.410623 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:55 crc kubenswrapper[4782]: I0130 18:30:55.670033 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:30:55 crc kubenswrapper[4782]: I0130 18:30:55.670698 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:55 crc kubenswrapper[4782]: I0130 18:30:55.702079 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:30:56 crc kubenswrapper[4782]: I0130 18:30:56.144838 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:56 crc kubenswrapper[4782]: E0130 18:30:56.145071 4782 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:56 crc kubenswrapper[4782]: E0130 18:30:56.145416 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs podName:327fdbe8-f465-4ab9-9478-c937cb925ca1 nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.145388757 +0000 UTC m=+40.413766792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs") pod "network-metrics-daemon-d7zh6" (UID: "327fdbe8-f465-4ab9-9478-c937cb925ca1") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 18:30:56 crc kubenswrapper[4782]: I0130 18:30:56.228639 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d7zh6"] Jan 30 18:30:56 crc kubenswrapper[4782]: I0130 18:30:56.228830 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:56 crc kubenswrapper[4782]: E0130 18:30:56.228943 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7zh6" podUID="327fdbe8-f465-4ab9-9478-c937cb925ca1" Jan 30 18:30:56 crc kubenswrapper[4782]: I0130 18:30:56.410664 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:56 crc kubenswrapper[4782]: I0130 18:30:56.410713 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:56 crc kubenswrapper[4782]: E0130 18:30:56.410827 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:56 crc kubenswrapper[4782]: E0130 18:30:56.410943 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:56 crc kubenswrapper[4782]: I0130 18:30:56.672897 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:30:57 crc kubenswrapper[4782]: I0130 18:30:57.410533 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:57 crc kubenswrapper[4782]: E0130 18:30:57.410714 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 18:30:57 crc kubenswrapper[4782]: I0130 18:30:57.682774 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.410302 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.410331 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.410441 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:30:58 crc kubenswrapper[4782]: E0130 18:30:58.410544 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 18:30:58 crc kubenswrapper[4782]: E0130 18:30:58.410727 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7zh6" podUID="327fdbe8-f465-4ab9-9478-c937cb925ca1" Jan 30 18:30:58 crc kubenswrapper[4782]: E0130 18:30:58.410982 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.934308 4782 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.934572 4782 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.989716 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xmfnp"] Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.990644 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.991125 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm"] Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.992153 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.992421 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz"] Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.993172 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.993664 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kr9hs"] Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.994655 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.994745 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.994996 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6ncmc"] Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.995261 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.995611 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:30:58 crc kubenswrapper[4782]: I0130 18:30:58.996849 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.000498 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.002718 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.002754 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.003518 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.003781 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.003946 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hcttm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.003957 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.005387 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.005662 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.007114 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.009164 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.019490 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.019532 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.020275 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.020584 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021101 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021381 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021422 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021556 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021607 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021742 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021772 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021842 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021560 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.021998 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.022284 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.022515 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.022595 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.024770 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.025378 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.026606 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq5gm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.027526 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.029257 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gsv8k"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.029654 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.030448 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.033401 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wqdz8"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.034169 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.034340 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.035144 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.037514 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.037908 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.038188 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.038297 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.038540 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.039564 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.039692 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.039786 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.040038 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.040158 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.040315 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.040430 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.040632 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.040788 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.041256 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.042569 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.042747 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.042745 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.042877 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.043509 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.044014 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.044576 4782 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.045027 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.045282 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.045455 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.047906 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.048337 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v7rdd"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.048599 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4dqmv"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.048707 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.048853 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4dqmv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.049019 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.050740 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.050756 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.050741 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.050902 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.050984 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.051091 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.051825 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.052173 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.052583 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kr9hs"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.057082 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.079210 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.079303 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.079403 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.079461 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.079785 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.079975 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080122 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080152 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080303 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080337 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080368 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6ncmc"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080455 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080469 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080569 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080661 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.080884 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.081378 4782 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.082093 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083099 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083175 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6cc01c21-5f0e-4251-a777-a758380c9a4f-node-pullsecrets\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083212 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-client-ca\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083268 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6af0115e-9a4c-4c1a-8128-82c14bebf94c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083300 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-trusted-ca-bundle\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083331 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b534a5-da04-4cad-86a2-6db5e75da9af-serving-cert\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083356 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-audit-policies\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083379 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083418 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-service-ca-bundle\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083447 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083476 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-serving-cert\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083504 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409b217b-e96d-41d2-a902-b51fcdc07920-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083531 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083556 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083589 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.083617 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7f0188d2-4498-4e3d-8f80-f036de2bd8df-config\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.092503 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.092584 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.092707 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.093067 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.093347 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.093606 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.093699 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.093815 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.093881 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.093901 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094025 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094370 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.084421 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9097e3-f69b-49ae-9781-52921de78625-images\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094745 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094777 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-encryption-config\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094798 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e5834c-c620-441b-8bf2-ee904388abd4-trusted-ca\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094820 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-console-config\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094844 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9097e3-f69b-49ae-9781-52921de78625-config\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094897 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj2lx\" (UniqueName: \"kubernetes.io/projected/6af0115e-9a4c-4c1a-8128-82c14bebf94c-kube-api-access-gj2lx\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094925 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094948 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.094970 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-etcd-client\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095012 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-encryption-config\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095043 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-audit-policies\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095065 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-config\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095088 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2jmm\" (UniqueName: \"kubernetes.io/projected/995092de-971b-4634-8725-eb2cbc63b926-kube-api-access-s2jmm\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095117 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdlm\" (UniqueName: \"kubernetes.io/projected/261116f6-a031-456e-8119-4288bdb8a201-kube-api-access-ftdlm\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095144 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095165 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-oauth-serving-cert\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095188 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095217 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6tq\" (UniqueName: 
\"kubernetes.io/projected/34055f4b-7168-4722-8e00-d0de4f823f41-kube-api-access-kx6tq\") pod \"dns-operator-744455d44c-6ncmc\" (UID: \"34055f4b-7168-4722-8e00-d0de4f823f41\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095255 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095256 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-etcd-client\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095377 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095391 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-config\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095460 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnkvj\" (UniqueName: \"kubernetes.io/projected/22efd41f-5357-4820-afa4-09733ef60db0-kube-api-access-mnkvj\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095495 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7klfj\" (UniqueName: \"kubernetes.io/projected/66e5834c-c620-441b-8bf2-ee904388abd4-kube-api-access-7klfj\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095520 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f0188d2-4498-4e3d-8f80-f036de2bd8df-auth-proxy-config\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095560 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-image-import-ca\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095607 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e4804cf-00c0-4598-9254-c5c424b013c2-serving-cert\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095616 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095634 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66e5834c-c620-441b-8bf2-ee904388abd4-serving-cert\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095659 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34055f4b-7168-4722-8e00-d0de4f823f41-metrics-tls\") pod \"dns-operator-744455d44c-6ncmc\" (UID: \"34055f4b-7168-4722-8e00-d0de4f823f41\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095734 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095753 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095786 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-audit\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095819 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095867 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095910 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-config\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.095937 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/72c78b6a-9bc1-46c4-bac8-6fc43df61b5a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2clrq\" (UID: \"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096024 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e4561f-3acb-40aa-86fe-9fb86a840e31-serving-cert\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096072 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwns\" (UniqueName: \"kubernetes.io/projected/7f0188d2-4498-4e3d-8f80-f036de2bd8df-kube-api-access-hqwns\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096085 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096102 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-client-ca\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096127 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/409b217b-e96d-41d2-a902-b51fcdc07920-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096157 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096186 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/261116f6-a031-456e-8119-4288bdb8a201-serving-cert\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096209 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096252 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-config\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096280 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096327 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-oauth-config\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096355 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/f0e4561f-3acb-40aa-86fe-9fb86a840e31-kube-api-access-gnt8b\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096377 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-service-ca\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096417 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqh7m\" (UniqueName: \"kubernetes.io/projected/6e4804cf-00c0-4598-9254-c5c424b013c2-kube-api-access-wqh7m\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096442 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/261116f6-a031-456e-8119-4288bdb8a201-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096530 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj44p\" (UniqueName: 
\"kubernetes.io/projected/72c78b6a-9bc1-46c4-bac8-6fc43df61b5a-kube-api-access-pj44p\") pod \"cluster-samples-operator-665b6dd947-2clrq\" (UID: \"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096576 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af0115e-9a4c-4c1a-8128-82c14bebf94c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096614 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096640 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e5834c-c620-441b-8bf2-ee904388abd4-config\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096672 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qkvn\" (UniqueName: \"kubernetes.io/projected/328b2a40-069b-4eda-b7ae-38f62b5a192a-kube-api-access-6qkvn\") pod \"downloads-7954f5f757-4dqmv\" (UID: \"328b2a40-069b-4eda-b7ae-38f62b5a192a\") " pod="openshift-console/downloads-7954f5f757-4dqmv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096717 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/059df750-c2da-429e-bae4-c7271be158af-audit-dir\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096750 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6cc01c21-5f0e-4251-a777-a758380c9a4f-audit-dir\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096774 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ldcc\" (UniqueName: \"kubernetes.io/projected/f7b534a5-da04-4cad-86a2-6db5e75da9af-kube-api-access-2ldcc\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096804 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-serving-cert\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096862 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flvgz\" (UniqueName: \"kubernetes.io/projected/059df750-c2da-429e-bae4-c7271be158af-kube-api-access-flvgz\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096900 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdmsl\" (UniqueName: \"kubernetes.io/projected/6cc01c21-5f0e-4251-a777-a758380c9a4f-kube-api-access-rdmsl\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096937 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/995092de-971b-4634-8725-eb2cbc63b926-audit-dir\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.096987 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8tdm\" (UniqueName: \"kubernetes.io/projected/3b9097e3-f69b-49ae-9781-52921de78625-kube-api-access-n8tdm\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097006 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097022 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097049 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f28s\" (UniqueName: \"kubernetes.io/projected/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-kube-api-access-4f28s\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097080 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 
18:30:59.097105 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7f0188d2-4498-4e3d-8f80-f036de2bd8df-machine-approver-tls\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097142 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9097e3-f69b-49ae-9781-52921de78625-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097172 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097170 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-serving-cert\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.097460 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjwb\" (UniqueName: \"kubernetes.io/projected/409b217b-e96d-41d2-a902-b51fcdc07920-kube-api-access-rqjwb\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.098299 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xmfnp"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.099366 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.101434 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.102352 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq5gm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.106852 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hcttm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.107900 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.108526 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.111110 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.111179 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.117191 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.117776 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.117872 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.121424 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.122112 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.122759 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wqdz8"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.122862 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.123216 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.123711 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.124407 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.125400 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-k7vd9"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.125762 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.125947 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.126095 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.126463 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.126960 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.129295 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.130485 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.149043 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hfj2v"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.160363 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.163151 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gsv8k"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.163405 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.163502 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdvw5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.164511 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.166821 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.169721 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.170220 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.170982 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qjchv"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.172651 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.176815 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.181203 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.182592 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.183432 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.183819 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.184989 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.185274 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.186088 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-46dsj"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.186340 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.186720 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.187206 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.187795 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.188959 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.190422 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.191989 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.193116 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.193681 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.195163 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.196398 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zlh6p"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.197120 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9w2tq"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.197333 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.198899 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201611 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2jmm\" (UniqueName: \"kubernetes.io/projected/995092de-971b-4634-8725-eb2cbc63b926-kube-api-access-s2jmm\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201649 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdlm\" (UniqueName: \"kubernetes.io/projected/261116f6-a031-456e-8119-4288bdb8a201-kube-api-access-ftdlm\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201682 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-config\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201701 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201722 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201741 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6tq\" (UniqueName: \"kubernetes.io/projected/34055f4b-7168-4722-8e00-d0de4f823f41-kube-api-access-kx6tq\") pod \"dns-operator-744455d44c-6ncmc\" (UID: \"34055f4b-7168-4722-8e00-d0de4f823f41\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201764 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-etcd-client\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201781 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-oauth-serving-cert\") pod \"console-f9d7485db-hcttm\" (UID: 
\"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201801 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-config\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201817 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnkvj\" (UniqueName: \"kubernetes.io/projected/22efd41f-5357-4820-afa4-09733ef60db0-kube-api-access-mnkvj\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201840 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/83fd126e-ad11-4c51-a93f-3a7faefdf653-images\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201859 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f0188d2-4498-4e3d-8f80-f036de2bd8df-auth-proxy-config\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201878 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-image-import-ca\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201899 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e4804cf-00c0-4598-9254-c5c424b013c2-serving-cert\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201916 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7klfj\" (UniqueName: \"kubernetes.io/projected/66e5834c-c620-441b-8bf2-ee904388abd4-kube-api-access-7klfj\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201933 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34055f4b-7168-4722-8e00-d0de4f823f41-metrics-tls\") pod \"dns-operator-744455d44c-6ncmc\" (UID: \"34055f4b-7168-4722-8e00-d0de4f823f41\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201962 4782 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201982 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-audit\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.201998 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202014 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66e5834c-c620-441b-8bf2-ee904388abd4-serving-cert\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202034 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202048 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-config\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202074 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/72c78b6a-9bc1-46c4-bac8-6fc43df61b5a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2clrq\" (UID: \"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202094 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e4561f-3acb-40aa-86fe-9fb86a840e31-serving-cert\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202112 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwns\" (UniqueName: \"kubernetes.io/projected/7f0188d2-4498-4e3d-8f80-f036de2bd8df-kube-api-access-hqwns\") pod 
\"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202130 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/409b217b-e96d-41d2-a902-b51fcdc07920-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202150 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202168 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/261116f6-a031-456e-8119-4288bdb8a201-serving-cert\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202186 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-client-ca\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202207 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202245 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-config\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202270 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202292 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-oauth-config\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " 
pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202314 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-service-ca\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202339 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqh7m\" (UniqueName: \"kubernetes.io/projected/6e4804cf-00c0-4598-9254-c5c424b013c2-kube-api-access-wqh7m\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202358 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/261116f6-a031-456e-8119-4288bdb8a201-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202375 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/f0e4561f-3acb-40aa-86fe-9fb86a840e31-kube-api-access-gnt8b\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202396 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj44p\" (UniqueName: \"kubernetes.io/projected/72c78b6a-9bc1-46c4-bac8-6fc43df61b5a-kube-api-access-pj44p\") pod \"cluster-samples-operator-665b6dd947-2clrq\" (UID: \"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202418 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af0115e-9a4c-4c1a-8128-82c14bebf94c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202435 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e5834c-c620-441b-8bf2-ee904388abd4-config\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202454 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qkvn\" (UniqueName: \"kubernetes.io/projected/328b2a40-069b-4eda-b7ae-38f62b5a192a-kube-api-access-6qkvn\") pod \"downloads-7954f5f757-4dqmv\" (UID: \"328b2a40-069b-4eda-b7ae-38f62b5a192a\") " pod="openshift-console/downloads-7954f5f757-4dqmv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202471 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/059df750-c2da-429e-bae4-c7271be158af-audit-dir\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202498 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202517 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83fd126e-ad11-4c51-a93f-3a7faefdf653-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202535 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ldcc\" (UniqueName: \"kubernetes.io/projected/f7b534a5-da04-4cad-86a2-6db5e75da9af-kube-api-access-2ldcc\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202551 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-serving-cert\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202576 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flvgz\" (UniqueName: \"kubernetes.io/projected/059df750-c2da-429e-bae4-c7271be158af-kube-api-access-flvgz\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202591 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6cc01c21-5f0e-4251-a777-a758380c9a4f-audit-dir\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202608 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdmsl\" (UniqueName: \"kubernetes.io/projected/6cc01c21-5f0e-4251-a777-a758380c9a4f-kube-api-access-rdmsl\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202624 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/995092de-971b-4634-8725-eb2cbc63b926-audit-dir\") pod 
\"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202642 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202659 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f28s\" (UniqueName: \"kubernetes.io/projected/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-kube-api-access-4f28s\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202676 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202695 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7f0188d2-4498-4e3d-8f80-f036de2bd8df-machine-approver-tls\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202713 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9097e3-f69b-49ae-9781-52921de78625-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202731 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8tdm\" (UniqueName: \"kubernetes.io/projected/3b9097e3-f69b-49ae-9781-52921de78625-kube-api-access-n8tdm\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202751 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-serving-cert\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202769 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjwb\" (UniqueName: \"kubernetes.io/projected/409b217b-e96d-41d2-a902-b51fcdc07920-kube-api-access-rqjwb\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202781 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-oauth-serving-cert\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202787 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnql\" (UniqueName: \"kubernetes.io/projected/83fd126e-ad11-4c51-a93f-3a7faefdf653-kube-api-access-nhnql\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202858 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202884 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6cc01c21-5f0e-4251-a777-a758380c9a4f-node-pullsecrets\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202911 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-client-ca\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202935 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6af0115e-9a4c-4c1a-8128-82c14bebf94c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202958 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-trusted-ca-bundle\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.202985 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83fd126e-ad11-4c51-a93f-3a7faefdf653-proxy-tls\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 
18:30:59.203005 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-audit-policies\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203028 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203055 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b534a5-da04-4cad-86a2-6db5e75da9af-serving-cert\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203077 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-service-ca-bundle\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203103 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203123 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-serving-cert\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203150 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203173 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409b217b-e96d-41d2-a902-b51fcdc07920-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203218 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203257 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203276 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9097e3-f69b-49ae-9781-52921de78625-images\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203295 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203317 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-encryption-config\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203337 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e5834c-c620-441b-8bf2-ee904388abd4-trusted-ca\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203356 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0188d2-4498-4e3d-8f80-f036de2bd8df-config\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203375 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-console-config\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203394 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9097e3-f69b-49ae-9781-52921de78625-config\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203431 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj2lx\" (UniqueName: \"kubernetes.io/projected/6af0115e-9a4c-4c1a-8128-82c14bebf94c-kube-api-access-gj2lx\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203453 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203473 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203490 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-encryption-config\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203513 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-audit-policies\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.203532 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-etcd-client\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.204242 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-config\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.204857 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f0188d2-4498-4e3d-8f80-f036de2bd8df-config\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.205473 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/409b217b-e96d-41d2-a902-b51fcdc07920-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.210054 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-console-config\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.210538 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.211087 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9097e3-f69b-49ae-9781-52921de78625-config\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.211374 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b9097e3-f69b-49ae-9781-52921de78625-images\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.212110 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.212519 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6cc01c21-5f0e-4251-a777-a758380c9a4f-node-pullsecrets\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.214885 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e5834c-c620-441b-8bf2-ee904388abd4-trusted-ca\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.215302 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-client-ca\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.215854 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.215859 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f0188d2-4498-4e3d-8f80-f036de2bd8df-auth-proxy-config\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.215913 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.218419 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-image-import-ca\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.218907 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-audit-policies\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.219198 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-audit-policies\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.221610 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.222994 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-config\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.224181 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7mn9q"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.225439 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 
18:30:59.225680 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-trusted-ca-bundle\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.226209 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j25hm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.227065 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5xjfh"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.228571 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2zmw5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.230821 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-service-ca\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.232328 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/059df750-c2da-429e-bae4-c7271be158af-audit-dir\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.233936 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.242812 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.235318 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-audit\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.235915 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6cc01c21-5f0e-4251-a777-a758380c9a4f-audit-dir\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.236408 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/995092de-971b-4634-8725-eb2cbc63b926-audit-dir\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.237276 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/995092de-971b-4634-8725-eb2cbc63b926-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.243182 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b534a5-da04-4cad-86a2-6db5e75da9af-serving-cert\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.237338 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.238963 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-encryption-config\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.238967 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/261116f6-a031-456e-8119-4288bdb8a201-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.238469 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.239475 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-config\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.216369 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc01c21-5f0e-4251-a777-a758380c9a4f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.239728 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.239668 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.239814 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.239990 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b534a5-da04-4cad-86a2-6db5e75da9af-service-ca-bundle\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.240170 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244042 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6af0115e-9a4c-4c1a-8128-82c14bebf94c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.240726 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.239874 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.242245 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244384 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244570 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244601 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4dqmv"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244620 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244633 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244643 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244656 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244667 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v7rdd"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244679 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244694 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdvw5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244703 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qjchv"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.240492 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9w2tq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.244973 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-serving-cert\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.245075 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6af0115e-9a4c-4c1a-8128-82c14bebf94c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.240848 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34055f4b-7168-4722-8e00-d0de4f823f41-metrics-tls\") pod \"dns-operator-744455d44c-6ncmc\" (UID: \"34055f4b-7168-4722-8e00-d0de4f823f41\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.245139 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/409b217b-e96d-41d2-a902-b51fcdc07920-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.235116 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/995092de-971b-4634-8725-eb2cbc63b926-etcd-client\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.242831 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66e5834c-c620-441b-8bf2-ee904388abd4-config\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.245956 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7f0188d2-4498-4e3d-8f80-f036de2bd8df-machine-approver-tls\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.246298 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.247216 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/72c78b6a-9bc1-46c4-bac8-6fc43df61b5a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2clrq\" (UID: \"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.248496 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-etcd-client\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.247506 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e4804cf-00c0-4598-9254-c5c424b013c2-serving-cert\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.248587 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-serving-cert\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.245986 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.247568 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6cc01c21-5f0e-4251-a777-a758380c9a4f-encryption-config\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.249612 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.252576 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-config\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.252867 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-client-ca\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.249247 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f0e4561f-3acb-40aa-86fe-9fb86a840e31-serving-cert\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.254208 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66e5834c-c620-441b-8bf2-ee904388abd4-serving-cert\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.254410 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-serving-cert\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.254793 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b9097e3-f69b-49ae-9781-52921de78625-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.256738 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.257110 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/261116f6-a031-456e-8119-4288bdb8a201-serving-cert\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.257424 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-oauth-config\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.257522 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.258105 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.259086 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hfj2v"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.260086 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm"] Jan 30 18:30:59 
crc kubenswrapper[4782]: I0130 18:30:59.261060 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-46dsj"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.262350 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.263325 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9w2tq"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.264436 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.265557 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zlh6p"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.267863 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.268883 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j25hm"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.270019 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.271036 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.272905 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.273422 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.274631 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2zmw5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.275624 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.276652 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.277642 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h"] Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.305030 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/83fd126e-ad11-4c51-a93f-3a7faefdf653-images\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.305307 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83fd126e-ad11-4c51-a93f-3a7faefdf653-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.305468 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83fd126e-ad11-4c51-a93f-3a7faefdf653-proxy-tls\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.305514 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnql\" (UniqueName: \"kubernetes.io/projected/83fd126e-ad11-4c51-a93f-3a7faefdf653-kube-api-access-nhnql\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.306425 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83fd126e-ad11-4c51-a93f-3a7faefdf653-auth-proxy-config\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.313220 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.333915 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.354614 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.372721 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.376735 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.376937 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.377142 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: 
\"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.377151 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.377443 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.378393 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.379428 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.379663 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.403685 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.407871 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/83fd126e-ad11-4c51-a93f-3a7faefdf653-images\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.409768 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.413804 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.434261 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.452889 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.458729 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83fd126e-ad11-4c51-a93f-3a7faefdf653-proxy-tls\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.473216 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.493769 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.513462 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.533561 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.552603 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.573553 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.593243 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.612934 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.634875 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.653152 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.674562 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.693168 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.714296 4782 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.733296 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.753714 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.773008 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.793522 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.815109 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.834394 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.853775 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.873968 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.893665 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.913899 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.932633 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.953787 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.982059 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 18:30:59 crc kubenswrapper[4782]: I0130 18:30:59.992939 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.012845 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.032712 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.053849 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.073467 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.113887 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 30 18:31:00 
crc kubenswrapper[4782]: I0130 18:31:00.133725 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.154218 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.173725 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.191419 4782 request.go:700] Waited for 1.007563304s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.193745 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.214631 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.233163 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.253557 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.273714 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.294036 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.313643 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.343408 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.352749 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.373836 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.393475 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.410610 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.410663 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.410740 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.412699 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.432967 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.452835 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.474551 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.493986 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.514421 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.534129 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.554713 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.575813 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.615147 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdlm\" (UniqueName: \"kubernetes.io/projected/261116f6-a031-456e-8119-4288bdb8a201-kube-api-access-ftdlm\") pod \"openshift-config-operator-7777fb866f-rbvgr\" (UID: \"261116f6-a031-456e-8119-4288bdb8a201\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.636652 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2jmm\" (UniqueName: \"kubernetes.io/projected/995092de-971b-4634-8725-eb2cbc63b926-kube-api-access-s2jmm\") pod \"apiserver-7bbb656c7d-bl25w\" (UID: \"995092de-971b-4634-8725-eb2cbc63b926\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.650213 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.681604 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj2lx\" (UniqueName: 
\"kubernetes.io/projected/6af0115e-9a4c-4c1a-8128-82c14bebf94c-kube-api-access-gj2lx\") pod \"openshift-controller-manager-operator-756b6f6bc6-zbxkw\" (UID: \"6af0115e-9a4c-4c1a-8128-82c14bebf94c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.683831 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.695578 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6tq\" (UniqueName: \"kubernetes.io/projected/34055f4b-7168-4722-8e00-d0de4f823f41-kube-api-access-kx6tq\") pod \"dns-operator-744455d44c-6ncmc\" (UID: \"34055f4b-7168-4722-8e00-d0de4f823f41\") " pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.707894 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7klfj\" (UniqueName: \"kubernetes.io/projected/66e5834c-c620-441b-8bf2-ee904388abd4-kube-api-access-7klfj\") pod \"console-operator-58897d9998-v7rdd\" (UID: \"66e5834c-c620-441b-8bf2-ee904388abd4\") " pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.729825 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:00 crc kubenswrapper[4782]: E0130 18:31:00.730061 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:16.730027818 +0000 UTC m=+52.998405863 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.733756 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.739271 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnkvj\" (UniqueName: \"kubernetes.io/projected/22efd41f-5357-4820-afa4-09733ef60db0-kube-api-access-mnkvj\") pod \"console-f9d7485db-hcttm\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.776113 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqh7m\" (UniqueName: \"kubernetes.io/projected/6e4804cf-00c0-4598-9254-c5c424b013c2-kube-api-access-wqh7m\") pod \"controller-manager-879f6c89f-xmfnp\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.798464 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.800950 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/f0e4561f-3acb-40aa-86fe-9fb86a840e31-kube-api-access-gnt8b\") pod \"route-controller-manager-6576b87f9c-x4wsm\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.806874 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.810146 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj44p\" (UniqueName: \"kubernetes.io/projected/72c78b6a-9bc1-46c4-bac8-6fc43df61b5a-kube-api-access-pj44p\") pod \"cluster-samples-operator-665b6dd947-2clrq\" (UID: \"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.811869 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.828122 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.831168 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.831220 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.831336 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.831379 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.831875 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.835190 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ldcc\" (UniqueName: \"kubernetes.io/projected/f7b534a5-da04-4cad-86a2-6db5e75da9af-kube-api-access-2ldcc\") pod \"authentication-operator-69f744f599-wqdz8\" (UID: \"f7b534a5-da04-4cad-86a2-6db5e75da9af\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.852838 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqwns\" (UniqueName: \"kubernetes.io/projected/7f0188d2-4498-4e3d-8f80-f036de2bd8df-kube-api-access-hqwns\") pod \"machine-approver-56656f9798-nlpdm\" (UID: \"7f0188d2-4498-4e3d-8f80-f036de2bd8df\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.855435 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.871407 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flvgz\" (UniqueName: \"kubernetes.io/projected/059df750-c2da-429e-bae4-c7271be158af-kube-api-access-flvgz\") pod \"oauth-openshift-558db77b4-gsv8k\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:31:00 crc kubenswrapper[4782]: W0130 18:31:00.885908 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f0188d2_4498_4e3d_8f80_f036de2bd8df.slice/crio-c23e0c5eec4269cb20c1619912a58f316b02815c0e3a08582641a14bd96c5186 WatchSource:0}: Error finding container c23e0c5eec4269cb20c1619912a58f316b02815c0e3a08582641a14bd96c5186: Status 404 returned error can't find the container with id c23e0c5eec4269cb20c1619912a58f316b02815c0e3a08582641a14bd96c5186 Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.891691 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdmsl\" (UniqueName: \"kubernetes.io/projected/6cc01c21-5f0e-4251-a777-a758380c9a4f-kube-api-access-rdmsl\") pod \"apiserver-76f77b778f-fq5gm\" (UID: \"6cc01c21-5f0e-4251-a777-a758380c9a4f\") " pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.906983 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw"] Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.919751 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f28s\" (UniqueName: \"kubernetes.io/projected/78b4a110-27ac-4a1a-9b2e-091f29aeaf52-kube-api-access-4f28s\") pod \"cluster-image-registry-operator-dc59b4c8b-4vxvv\" (UID: \"78b4a110-27ac-4a1a-9b2e-091f29aeaf52\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.924958 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.933887 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8tdm\" (UniqueName: \"kubernetes.io/projected/3b9097e3-f69b-49ae-9781-52921de78625-kube-api-access-n8tdm\") pod \"machine-api-operator-5694c8668f-kr9hs\" (UID: \"3b9097e3-f69b-49ae-9781-52921de78625\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.948394 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjwb\" (UniqueName: \"kubernetes.io/projected/409b217b-e96d-41d2-a902-b51fcdc07920-kube-api-access-rqjwb\") pod \"openshift-apiserver-operator-796bbdcf4f-rj6vz\" (UID: \"409b217b-e96d-41d2-a902-b51fcdc07920\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.962545 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.974470 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.976977 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qkvn\" (UniqueName: \"kubernetes.io/projected/328b2a40-069b-4eda-b7ae-38f62b5a192a-kube-api-access-6qkvn\") pod \"downloads-7954f5f757-4dqmv\" (UID: \"328b2a40-069b-4eda-b7ae-38f62b5a192a\") " pod="openshift-console/downloads-7954f5f757-4dqmv" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.990038 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.993361 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 18:31:00 crc kubenswrapper[4782]: I0130 18:31:00.997765 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.011065 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.013000 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.039774 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.041592 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.057732 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.073625 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.091834 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.095944 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.115590 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.120300 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4dqmv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.136167 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.138739 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.155881 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.174017 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.176178 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.192107 4782 request.go:700] Waited for 1.939777163s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dservice-ca-dockercfg-pn86c&limit=500&resourceVersion=0 Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.193810 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.202064 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.210818 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.213839 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.234806 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.252879 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.274366 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.296681 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.305100 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.308948 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.314816 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.333357 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.341883 4782 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xmfnp"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.344704 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hcttm"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.344752 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-v7rdd"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.364654 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.386544 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.394976 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.417687 4782 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.442499 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4dqmv"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.473085 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.474820 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnql\" (UniqueName: \"kubernetes.io/projected/83fd126e-ad11-4c51-a93f-3a7faefdf653-kube-api-access-nhnql\") pod \"machine-config-operator-74547568cd-6mr8h\" (UID: \"83fd126e-ad11-4c51-a93f-3a7faefdf653\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.487473 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6ncmc"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.499192 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.506842 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.506933 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.535427 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541461 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89966112-dfc8-4611-8fbc-f18b58dca7e6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541576 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a981a6c-a999-4661-89fc-4c1e04c6fcac-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541610 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-service-ca\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541685 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c326d3fd-0db9-4682-a522-31bc619a2e5a-trusted-ca\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541704 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-ca\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541734 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34770880-dc82-40ff-9989-bbe06f230233-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541750 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c326d3fd-0db9-4682-a522-31bc619a2e5a-metrics-tls\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541784 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c326d3fd-0db9-4682-a522-31bc619a2e5a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541831 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34770880-dc82-40ff-9989-bbe06f230233-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541848 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-default-certificate\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541879 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a981a6c-a999-4661-89fc-4c1e04c6fcac-config\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541917 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-service-ca-bundle\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541969 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-config\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.541986 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5tbf\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-kube-api-access-v5tbf\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.542024 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q89j2\" (UniqueName: \"kubernetes.io/projected/50209ca4-3459-4af3-8f4c-a377c053e65f-kube-api-access-q89j2\") pod \"multus-admission-controller-857f4d67dd-hfj2v\" (UID: \"50209ca4-3459-4af3-8f4c-a377c053e65f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.542081 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89966112-dfc8-4611-8fbc-f18b58dca7e6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.542120 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-registry-tls\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.542156 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-client\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.542172 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-bound-sa-token\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.542191 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmldq\" (UniqueName: \"kubernetes.io/projected/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-kube-api-access-vmldq\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.546974 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-stats-auth\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547016 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h9pj\" (UniqueName: \"kubernetes.io/projected/c326d3fd-0db9-4682-a522-31bc619a2e5a-kube-api-access-7h9pj\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547077 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-registry-certificates\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547106 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/536ab9b3-58fe-47c6-9917-d985d8a986eb-serving-cert\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547382 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2pjd\" (UniqueName: 
\"kubernetes.io/projected/115be544-7346-440e-b0bd-04a31a98da70-kube-api-access-x2pjd\") pod \"migrator-59844c95c7-tqqqd\" (UID: \"115be544-7346-440e-b0bd-04a31a98da70\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547486 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89966112-dfc8-4611-8fbc-f18b58dca7e6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547518 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50209ca4-3459-4af3-8f4c-a377c053e65f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hfj2v\" (UID: \"50209ca4-3459-4af3-8f4c-a377c053e65f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547565 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-trusted-ca\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547589 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-metrics-certs\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547684 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4c9c21a-67d6-4e7b-af12-671a2a108a57-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547710 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9c21a-67d6-4e7b-af12-671a2a108a57-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547771 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb6d8\" (UniqueName: \"kubernetes.io/projected/536ab9b3-58fe-47c6-9917-d985d8a986eb-kube-api-access-tb6d8\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547891 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547955 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a981a6c-a999-4661-89fc-4c1e04c6fcac-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.547984 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9c21a-67d6-4e7b-af12-671a2a108a57-config\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.550085 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:31:01 crc kubenswrapper[4782]: E0130 18:31:01.555117 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.05509799 +0000 UTC m=+38.323476015 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.558698 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.558776 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.565448 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.572916 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.596716 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.598342 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq5gm"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.598964 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gsv8k"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.621531 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kr9hs"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.637370 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.648420 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649100 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q89j2\" (UniqueName: \"kubernetes.io/projected/50209ca4-3459-4af3-8f4c-a377c053e65f-kube-api-access-q89j2\") pod \"multus-admission-controller-857f4d67dd-hfj2v\" (UID: \"50209ca4-3459-4af3-8f4c-a377c053e65f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649179 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1fb0642-d2bd-43f7-962a-761f1469df24-srv-cert\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649216 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa83ad46-833b-4087-a185-83a9dee9a31c-proxy-tls\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649266 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-csi-data-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649286 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1fb0642-d2bd-43f7-962a-761f1469df24-profile-collector-cert\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649306 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89966112-dfc8-4611-8fbc-f18b58dca7e6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649412 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-registry-tls\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc 
kubenswrapper[4782]: I0130 18:31:01.649438 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-mountpoint-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649455 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699e1980-2c0b-4f99-8977-3ce25d99f142-secret-volume\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649473 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-bound-sa-token\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649492 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmldq\" (UniqueName: \"kubernetes.io/projected/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-kube-api-access-vmldq\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649511 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-client\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649529 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-serving-cert\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649737 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-stats-auth\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649760 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h9pj\" (UniqueName: \"kubernetes.io/projected/c326d3fd-0db9-4682-a522-31bc619a2e5a-kube-api-access-7h9pj\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649784 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-config\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649844 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/910b0df0-af99-41ed-8fc2-b4c781979c56-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649863 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxqjq\" (UniqueName: \"kubernetes.io/projected/2f8b3777-ad72-4e02-b6ac-3664c32415b5-kube-api-access-hxqjq\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649888 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-registry-certificates\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649912 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/536ab9b3-58fe-47c6-9917-d985d8a986eb-serving-cert\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649934 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcdb741e-92ca-479b-a935-9bf1962bf7e5-webhook-cert\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649954 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5796n\" (UniqueName: \"kubernetes.io/projected/06d5ad24-7f40-4e85-8f67-b55196c538d7-kube-api-access-5796n\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649977 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f8b3777-ad72-4e02-b6ac-3664c32415b5-config-volume\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.649998 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h679v\" (UniqueName: \"kubernetes.io/projected/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-kube-api-access-h679v\") pod \"marketplace-operator-79b997595-46dsj\" (UID: 
\"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.650021 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/910b0df0-af99-41ed-8fc2-b4c781979c56-srv-cert\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.650043 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f8b3777-ad72-4e02-b6ac-3664c32415b5-metrics-tls\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.650062 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e52cd5-7405-4bc4-b27d-6663543f2c60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mdlt5\" (UID: \"a2e52cd5-7405-4bc4-b27d-6663543f2c60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.650087 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86pv6\" (UniqueName: \"kubernetes.io/projected/5b76d69f-7ca3-4cb0-a795-b106479a0b50-kube-api-access-86pv6\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.650110 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2pjd\" (UniqueName: \"kubernetes.io/projected/115be544-7346-440e-b0bd-04a31a98da70-kube-api-access-x2pjd\") pod \"migrator-59844c95c7-tqqqd\" (UID: \"115be544-7346-440e-b0bd-04a31a98da70\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.650134 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89966112-dfc8-4611-8fbc-f18b58dca7e6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.650303 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50209ca4-3459-4af3-8f4c-a377c053e65f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hfj2v\" (UID: \"50209ca4-3459-4af3-8f4c-a377c053e65f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:31:01 crc kubenswrapper[4782]: E0130 18:31:01.655806 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.155749762 +0000 UTC m=+38.424127787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.655963 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/50209ca4-3459-4af3-8f4c-a377c053e65f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hfj2v\" (UID: \"50209ca4-3459-4af3-8f4c-a377c053e65f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.657194 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89966112-dfc8-4611-8fbc-f18b58dca7e6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.657363 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-registry-certificates\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.659988 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-stats-auth\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.663702 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-client\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667590 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-trusted-ca\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667675 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-metrics-certs\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667721 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4c9c21a-67d6-4e7b-af12-671a2a108a57-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667750 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9c21a-67d6-4e7b-af12-671a2a108a57-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667788 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2fb13904-c3df-4314-a955-cc4be2026b0c-signing-cabundle\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667820 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-plugins-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667854 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a178dd-c047-41eb-9d74-e3ae83797a89-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667893 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd0f947d-ef9a-43ea-a5a0-7fe20d429739-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-df42d\" (UID: \"dd0f947d-ef9a-43ea-a5a0-7fe20d429739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667924 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb6d8\" (UniqueName: \"kubernetes.io/projected/536ab9b3-58fe-47c6-9917-d985d8a986eb-kube-api-access-tb6d8\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.667973 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668010 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0a981a6c-a999-4661-89fc-4c1e04c6fcac-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668044 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9c21a-67d6-4e7b-af12-671a2a108a57-config\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668081 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aa83ad46-833b-4087-a185-83a9dee9a31c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668110 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8qbk\" (UniqueName: \"kubernetes.io/projected/f3a178dd-c047-41eb-9d74-e3ae83797a89-kube-api-access-f8qbk\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668147 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sspsj\" (UniqueName: \"kubernetes.io/projected/aa83ad46-833b-4087-a185-83a9dee9a31c-kube-api-access-sspsj\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668191 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89966112-dfc8-4611-8fbc-f18b58dca7e6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668258 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kvsb\" (UniqueName: \"kubernetes.io/projected/699e1980-2c0b-4f99-8977-3ce25d99f142-kube-api-access-7kvsb\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668297 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/343fced1-cf4b-4d48-9a25-df17b608e09e-ready\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668332 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668368 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a981a6c-a999-4661-89fc-4c1e04c6fcac-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668404 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/06d5ad24-7f40-4e85-8f67-b55196c538d7-node-bootstrap-token\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668445 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-service-ca\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668474 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2lg\" (UniqueName: \"kubernetes.io/projected/c1fb0642-d2bd-43f7-962a-761f1469df24-kube-api-access-xg2lg\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668511 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/343fced1-cf4b-4d48-9a25-df17b608e09e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668547 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc2j8\" (UniqueName: \"kubernetes.io/projected/fcdb741e-92ca-479b-a935-9bf1962bf7e5-kube-api-access-lc2j8\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668580 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcdb741e-92ca-479b-a935-9bf1962bf7e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668641 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c326d3fd-0db9-4682-a522-31bc619a2e5a-trusted-ca\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668869 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/343fced1-cf4b-4d48-9a25-df17b608e09e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668922 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-ca\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.668958 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fcdb741e-92ca-479b-a935-9bf1962bf7e5-tmpfs\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669000 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/06d5ad24-7f40-4e85-8f67-b55196c538d7-certs\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669082 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34770880-dc82-40ff-9989-bbe06f230233-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669118 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c326d3fd-0db9-4682-a522-31bc619a2e5a-metrics-tls\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669156 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7npv\" (UniqueName: \"kubernetes.io/projected/dd0f947d-ef9a-43ea-a5a0-7fe20d429739-kube-api-access-p7npv\") pod \"control-plane-machine-set-operator-78cbb6b69f-df42d\" (UID: \"dd0f947d-ef9a-43ea-a5a0-7fe20d429739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669189 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c326d3fd-0db9-4682-a522-31bc619a2e5a-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669221 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbpzc\" (UniqueName: \"kubernetes.io/projected/2fb13904-c3df-4314-a955-cc4be2026b0c-kube-api-access-dbpzc\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669329 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74d4b963-83f0-4327-b969-ec69902f92a6-cert\") pod \"ingress-canary-zlh6p\" (UID: \"74d4b963-83f0-4327-b969-ec69902f92a6\") " pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669362 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-registration-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669392 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34770880-dc82-40ff-9989-bbe06f230233-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669418 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcvjr\" (UniqueName: \"kubernetes.io/projected/a2e52cd5-7405-4bc4-b27d-6663543f2c60-kube-api-access-tcvjr\") pod \"package-server-manager-789f6589d5-mdlt5\" (UID: \"a2e52cd5-7405-4bc4-b27d-6663543f2c60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669439 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-socket-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669465 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-default-certificate\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669487 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a178dd-c047-41eb-9d74-e3ae83797a89-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669506 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcx7k\" (UniqueName: \"kubernetes.io/projected/910b0df0-af99-41ed-8fc2-b4c781979c56-kube-api-access-hcx7k\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669533 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a981a6c-a999-4661-89fc-4c1e04c6fcac-config\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669558 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8lbj\" (UniqueName: \"kubernetes.io/projected/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-kube-api-access-v8lbj\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669580 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh57z\" (UniqueName: \"kubernetes.io/projected/74d4b963-83f0-4327-b969-ec69902f92a6-kube-api-access-gh57z\") pod \"ingress-canary-zlh6p\" (UID: \"74d4b963-83f0-4327-b969-ec69902f92a6\") " pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669605 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-service-ca-bundle\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669629 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669657 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8jwr\" (UniqueName: \"kubernetes.io/projected/343fced1-cf4b-4d48-9a25-df17b608e09e-kube-api-access-p8jwr\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669690 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-config\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 
18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669711 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2fb13904-c3df-4314-a955-cc4be2026b0c-signing-key\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669729 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699e1980-2c0b-4f99-8977-3ce25d99f142-config-volume\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.669754 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5tbf\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-kube-api-access-v5tbf\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.672402 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.675972 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c9c21a-67d6-4e7b-af12-671a2a108a57-config\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: E0130 18:31:01.672929 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.172909055 +0000 UTC m=+38.441287070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.674045 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-ca\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.675190 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-metrics-certs\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.676563 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a981a6c-a999-4661-89fc-4c1e04c6fcac-config\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.676989 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34770880-dc82-40ff-9989-bbe06f230233-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.674150 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-etcd-service-ca\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.678916 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-default-certificate\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.679867 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a981a6c-a999-4661-89fc-4c1e04c6fcac-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.681615 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/536ab9b3-58fe-47c6-9917-d985d8a986eb-config\") pod \"etcd-operator-b45778765-hdvw5\" (UID: 
\"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.681794 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-service-ca-bundle\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.681929 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-trusted-ca\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.682793 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/536ab9b3-58fe-47c6-9917-d985d8a986eb-serving-cert\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.683295 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4c9c21a-67d6-4e7b-af12-671a2a108a57-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.685937 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c326d3fd-0db9-4682-a522-31bc619a2e5a-trusted-ca\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.689570 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-registry-tls\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.693321 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34770880-dc82-40ff-9989-bbe06f230233-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.695501 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q89j2\" (UniqueName: \"kubernetes.io/projected/50209ca4-3459-4af3-8f4c-a377c053e65f-kube-api-access-q89j2\") pod \"multus-admission-controller-857f4d67dd-hfj2v\" (UID: \"50209ca4-3459-4af3-8f4c-a377c053e65f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.700381 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c326d3fd-0db9-4682-a522-31bc619a2e5a-metrics-tls\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.703346 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89966112-dfc8-4611-8fbc-f18b58dca7e6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.708692 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.710070 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wqdz8"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.710628 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h9pj\" (UniqueName: \"kubernetes.io/projected/c326d3fd-0db9-4682-a522-31bc619a2e5a-kube-api-access-7h9pj\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.722445 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcttm" event={"ID":"22efd41f-5357-4820-afa4-09733ef60db0","Type":"ContainerStarted","Data":"307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.722504 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcttm" event={"ID":"22efd41f-5357-4820-afa4-09733ef60db0","Type":"ContainerStarted","Data":"a035f06032de2eb5afd1078c88dae055d70bbe165679851dc18a62f3ea624e8a"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.730351 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" event={"ID":"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a","Type":"ContainerStarted","Data":"03fa1a2e85ea3e9e6318368c9014f4be27b2203051a56828dd9f398d40cbbea8"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.730428 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" event={"ID":"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a","Type":"ContainerStarted","Data":"0fc571988206de20458616bca86affc656233296e93970a99855b44eebf6d35d"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.734686 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmldq\" (UniqueName: \"kubernetes.io/projected/7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa-kube-api-access-vmldq\") pod \"router-default-5444994796-k7vd9\" (UID: \"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa\") " pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.739305 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4dqmv" 
event={"ID":"328b2a40-069b-4eda-b7ae-38f62b5a192a","Type":"ContainerStarted","Data":"53e91e29a56cd9dc2a14d66cf9885e26f815efe7d235f7f9406a0436c8a35144"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.740976 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" event={"ID":"059df750-c2da-429e-bae4-c7271be158af","Type":"ContainerStarted","Data":"82d0fe0831385987ff1153df90293e9515a66987a7b29f2da49b2c7a726f816b"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.742924 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" event={"ID":"3b9097e3-f69b-49ae-9781-52921de78625","Type":"ContainerStarted","Data":"44b48f7a1cf032fbcd3002af24ed1ba8ca3c363a24d015bcd18f96a078cbb1cd"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.752211 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-bound-sa-token\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.753747 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" event={"ID":"6e4804cf-00c0-4598-9254-c5c424b013c2","Type":"ContainerStarted","Data":"08d773ca39566ae2c56c91e635802de7c7ddc86cd75bda0d6cb6fc8d4b81df12"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.753809 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" event={"ID":"6e4804cf-00c0-4598-9254-c5c424b013c2","Type":"ContainerStarted","Data":"556ff1980a2954b5b1f7261b1c6750bab42af385dd0e11551a974912148cb9b4"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.754033 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.756979 4782 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xmfnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.757058 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" podUID="6e4804cf-00c0-4598-9254-c5c424b013c2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.761304 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.761656 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv"] Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.767471 4782 generic.go:334] "Generic (PLEG): container finished" podID="261116f6-a031-456e-8119-4288bdb8a201" containerID="94351fea6953908c716a1fb54bec66c0eae169563de42ae24c95ffb735360750" exitCode=0 Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.767580 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" event={"ID":"261116f6-a031-456e-8119-4288bdb8a201","Type":"ContainerDied","Data":"94351fea6953908c716a1fb54bec66c0eae169563de42ae24c95ffb735360750"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.767613 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" event={"ID":"261116f6-a031-456e-8119-4288bdb8a201","Type":"ContainerStarted","Data":"dbfcfb4951c32f2043c0716abe6c772137fe23b0cf3322d0fa257ab439991501"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.771620 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:01 crc kubenswrapper[4782]: E0130 18:31:01.771785 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.271759594 +0000 UTC m=+38.540137619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.771835 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.771871 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8jwr\" (UniqueName: \"kubernetes.io/projected/343fced1-cf4b-4d48-9a25-df17b608e09e-kube-api-access-p8jwr\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.771900 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699e1980-2c0b-4f99-8977-3ce25d99f142-config-volume\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.771937 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2fb13904-c3df-4314-a955-cc4be2026b0c-signing-key\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.771962 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1fb0642-d2bd-43f7-962a-761f1469df24-srv-cert\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.771986 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-csi-data-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772013 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa83ad46-833b-4087-a185-83a9dee9a31c-proxy-tls\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772041 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/c1fb0642-d2bd-43f7-962a-761f1469df24-profile-collector-cert\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772065 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-mountpoint-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772088 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699e1980-2c0b-4f99-8977-3ce25d99f142-secret-volume\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772109 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-serving-cert\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772135 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/910b0df0-af99-41ed-8fc2-b4c781979c56-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772160 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxqjq\" (UniqueName: \"kubernetes.io/projected/2f8b3777-ad72-4e02-b6ac-3664c32415b5-kube-api-access-hxqjq\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772181 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-config\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772207 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcdb741e-92ca-479b-a935-9bf1962bf7e5-webhook-cert\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772248 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5796n\" (UniqueName: \"kubernetes.io/projected/06d5ad24-7f40-4e85-8f67-b55196c538d7-kube-api-access-5796n\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 
18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772273 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f8b3777-ad72-4e02-b6ac-3664c32415b5-config-volume\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772299 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/910b0df0-af99-41ed-8fc2-b4c781979c56-srv-cert\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772324 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h679v\" (UniqueName: \"kubernetes.io/projected/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-kube-api-access-h679v\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772353 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f8b3777-ad72-4e02-b6ac-3664c32415b5-metrics-tls\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772382 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e52cd5-7405-4bc4-b27d-6663543f2c60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mdlt5\" (UID: \"a2e52cd5-7405-4bc4-b27d-6663543f2c60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772423 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86pv6\" (UniqueName: \"kubernetes.io/projected/5b76d69f-7ca3-4cb0-a795-b106479a0b50-kube-api-access-86pv6\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772470 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2fb13904-c3df-4314-a955-cc4be2026b0c-signing-cabundle\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772506 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a178dd-c047-41eb-9d74-e3ae83797a89-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772535 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/dd0f947d-ef9a-43ea-a5a0-7fe20d429739-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-df42d\" (UID: \"dd0f947d-ef9a-43ea-a5a0-7fe20d429739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772571 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-plugins-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772619 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772651 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aa83ad46-833b-4087-a185-83a9dee9a31c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772677 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8qbk\" (UniqueName: \"kubernetes.io/projected/f3a178dd-c047-41eb-9d74-e3ae83797a89-kube-api-access-f8qbk\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772714 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sspsj\" (UniqueName: \"kubernetes.io/projected/aa83ad46-833b-4087-a185-83a9dee9a31c-kube-api-access-sspsj\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772745 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kvsb\" (UniqueName: \"kubernetes.io/projected/699e1980-2c0b-4f99-8977-3ce25d99f142-kube-api-access-7kvsb\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772773 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/343fced1-cf4b-4d48-9a25-df17b608e09e-ready\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772798 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772831 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/06d5ad24-7f40-4e85-8f67-b55196c538d7-node-bootstrap-token\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772863 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/343fced1-cf4b-4d48-9a25-df17b608e09e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772891 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc2j8\" (UniqueName: \"kubernetes.io/projected/fcdb741e-92ca-479b-a935-9bf1962bf7e5-kube-api-access-lc2j8\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.772916 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2lg\" (UniqueName: \"kubernetes.io/projected/c1fb0642-d2bd-43f7-962a-761f1469df24-kube-api-access-xg2lg\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773081 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcdb741e-92ca-479b-a935-9bf1962bf7e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773111 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/343fced1-cf4b-4d48-9a25-df17b608e09e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773156 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fcdb741e-92ca-479b-a935-9bf1962bf7e5-tmpfs\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773185 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/06d5ad24-7f40-4e85-8f67-b55196c538d7-certs\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " 
pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773214 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7npv\" (UniqueName: \"kubernetes.io/projected/dd0f947d-ef9a-43ea-a5a0-7fe20d429739-kube-api-access-p7npv\") pod \"control-plane-machine-set-operator-78cbb6b69f-df42d\" (UID: \"dd0f947d-ef9a-43ea-a5a0-7fe20d429739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773281 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbpzc\" (UniqueName: \"kubernetes.io/projected/2fb13904-c3df-4314-a955-cc4be2026b0c-kube-api-access-dbpzc\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773306 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74d4b963-83f0-4327-b969-ec69902f92a6-cert\") pod \"ingress-canary-zlh6p\" (UID: \"74d4b963-83f0-4327-b969-ec69902f92a6\") " pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773330 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-registration-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773358 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcvjr\" (UniqueName: \"kubernetes.io/projected/a2e52cd5-7405-4bc4-b27d-6663543f2c60-kube-api-access-tcvjr\") pod \"package-server-manager-789f6589d5-mdlt5\" (UID: \"a2e52cd5-7405-4bc4-b27d-6663543f2c60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773385 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-socket-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773413 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcx7k\" (UniqueName: \"kubernetes.io/projected/910b0df0-af99-41ed-8fc2-b4c781979c56-kube-api-access-hcx7k\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773442 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a178dd-c047-41eb-9d74-e3ae83797a89-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773477 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v8lbj\" (UniqueName: \"kubernetes.io/projected/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-kube-api-access-v8lbj\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773490 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.773506 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh57z\" (UniqueName: \"kubernetes.io/projected/74d4b963-83f0-4327-b969-ec69902f92a6-kube-api-access-gh57z\") pod \"ingress-canary-zlh6p\" (UID: \"74d4b963-83f0-4327-b969-ec69902f92a6\") " pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.774539 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2fb13904-c3df-4314-a955-cc4be2026b0c-signing-cabundle\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.775180 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a178dd-c047-41eb-9d74-e3ae83797a89-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.775323 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699e1980-2c0b-4f99-8977-3ce25d99f142-config-volume\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.775803 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" event={"ID":"995092de-971b-4634-8725-eb2cbc63b926","Type":"ContainerStarted","Data":"c1fc86d2ac8b2285e34b43c8d5d532de5821144e8d1efdc9b0ebb92ce960fde7"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.776196 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-socket-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.776326 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-registration-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc 
kubenswrapper[4782]: I0130 18:31:01.776338 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-plugins-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: E0130 18:31:01.776459 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.276438676 +0000 UTC m=+38.544816701 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.776685 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/343fced1-cf4b-4d48-9a25-df17b608e09e-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.777200 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aa83ad46-833b-4087-a185-83a9dee9a31c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.777935 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcdb741e-92ca-479b-a935-9bf1962bf7e5-apiservice-cert\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.780780 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-config\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.781010 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-csi-data-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.781024 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5b76d69f-7ca3-4cb0-a795-b106479a0b50-mountpoint-dir\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " 
pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.781032 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74d4b963-83f0-4327-b969-ec69902f92a6-cert\") pod \"ingress-canary-zlh6p\" (UID: \"74d4b963-83f0-4327-b969-ec69902f92a6\") " pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.781510 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fcdb741e-92ca-479b-a935-9bf1962bf7e5-tmpfs\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.781795 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2fb13904-c3df-4314-a955-cc4be2026b0c-signing-key\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.783629 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f8b3777-ad72-4e02-b6ac-3664c32415b5-config-volume\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.784614 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" event={"ID":"34055f4b-7168-4722-8e00-d0de4f823f41","Type":"ContainerStarted","Data":"302997c7e443c8e8a215e4b01dec203ad3cc94f813087045768e4a94b5cd5757"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.784667 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/343fced1-cf4b-4d48-9a25-df17b608e09e-ready\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.784980 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/343fced1-cf4b-4d48-9a25-df17b608e09e-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.785540 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/910b0df0-af99-41ed-8fc2-b4c781979c56-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.786255 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" event={"ID":"6cc01c21-5f0e-4251-a777-a758380c9a4f","Type":"ContainerStarted","Data":"e2c691dcc5c0649e6fde1715dc438a7808513e802ac13605739fccd54eacd3f4"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.787668 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-serving-cert\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.787878 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699e1980-2c0b-4f99-8977-3ce25d99f142-secret-volume\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.788409 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/06d5ad24-7f40-4e85-8f67-b55196c538d7-node-bootstrap-token\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.788495 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.788707 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/06d5ad24-7f40-4e85-8f67-b55196c538d7-certs\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.788710 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" event={"ID":"f0e4561f-3acb-40aa-86fe-9fb86a840e31","Type":"ContainerStarted","Data":"6a7a4c929ffc1bae1ab30059cf19a8b50e583907785398bf1fd12072d8fbce73"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.788771 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" event={"ID":"f0e4561f-3acb-40aa-86fe-9fb86a840e31","Type":"ContainerStarted","Data":"8ab7dc58c6394ffc1b0fb910dc0ca63b431ec77b10225f0d25ebee6263f5089e"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.789004 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c1fb0642-d2bd-43f7-962a-761f1469df24-profile-collector-cert\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.789266 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.789781 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2pjd\" (UniqueName: \"kubernetes.io/projected/115be544-7346-440e-b0bd-04a31a98da70-kube-api-access-x2pjd\") pod \"migrator-59844c95c7-tqqqd\" (UID: 
\"115be544-7346-440e-b0bd-04a31a98da70\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.790291 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd0f947d-ef9a-43ea-a5a0-7fe20d429739-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-df42d\" (UID: \"dd0f947d-ef9a-43ea-a5a0-7fe20d429739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.791470 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e52cd5-7405-4bc4-b27d-6663543f2c60-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mdlt5\" (UID: \"a2e52cd5-7405-4bc4-b27d-6663543f2c60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.792915 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c1fb0642-d2bd-43f7-962a-761f1469df24-srv-cert\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.795598 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f8b3777-ad72-4e02-b6ac-3664c32415b5-metrics-tls\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.795647 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcdb741e-92ca-479b-a935-9bf1962bf7e5-webhook-cert\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.795938 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa83ad46-833b-4087-a185-83a9dee9a31c-proxy-tls\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.796055 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a178dd-c047-41eb-9d74-e3ae83797a89-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.796210 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/910b0df0-af99-41ed-8fc2-b4c781979c56-srv-cert\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:01 crc kubenswrapper[4782]: 
I0130 18:31:01.796928 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" event={"ID":"6af0115e-9a4c-4c1a-8128-82c14bebf94c","Type":"ContainerStarted","Data":"cb204f00fb06cb16805c3029c3e1fee5505720a6ac9e7f241d3d6b3c5b7a9eda"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.797070 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" event={"ID":"6af0115e-9a4c-4c1a-8128-82c14bebf94c","Type":"ContainerStarted","Data":"da72dc038999d6b490642584dad187c55abc31ed07d9e150a84a4f3d4009654b"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.801469 4782 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-x4wsm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.801520 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" podUID="f0e4561f-3acb-40aa-86fe-9fb86a840e31" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.805946 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.813545 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5tbf\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-kube-api-access-v5tbf\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.818609 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.825573 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" event={"ID":"409b217b-e96d-41d2-a902-b51fcdc07920","Type":"ContainerStarted","Data":"4e937912ffdf99dcf8495a114a26a948ce5fb28cc74036d924e6266e9978bd58"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.838073 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb6d8\" (UniqueName: \"kubernetes.io/projected/536ab9b3-58fe-47c6-9917-d985d8a986eb-kube-api-access-tb6d8\") pod \"etcd-operator-b45778765-hdvw5\" (UID: \"536ab9b3-58fe-47c6-9917-d985d8a986eb\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.852026 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4c9c21a-67d6-4e7b-af12-671a2a108a57-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-297fm\" (UID: \"b4c9c21a-67d6-4e7b-af12-671a2a108a57\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.856060 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" event={"ID":"7f0188d2-4498-4e3d-8f80-f036de2bd8df","Type":"ContainerStarted","Data":"ab735cc1d2db015e2ef216eb416040a38a39940914e8532eab573441cf756bd8"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.856130 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" event={"ID":"7f0188d2-4498-4e3d-8f80-f036de2bd8df","Type":"ContainerStarted","Data":"c23e0c5eec4269cb20c1619912a58f316b02815c0e3a08582641a14bd96c5186"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.859311 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" event={"ID":"66e5834c-c620-441b-8bf2-ee904388abd4","Type":"ContainerStarted","Data":"a00950fe518e1f99c2925d10175850092232b8bf9b95cb781c653b7cf5c8b230"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.859340 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" event={"ID":"66e5834c-c620-441b-8bf2-ee904388abd4","Type":"ContainerStarted","Data":"4971905c9cb4358ab7621ed1585eb2370c477a4df9954a00ce8f17c9ba5c8ed4"} Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.860099 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.867685 4782 patch_prober.go:28] interesting pod/console-operator-58897d9998-v7rdd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.867766 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" podUID="66e5834c-c620-441b-8bf2-ee904388abd4" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.874203 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89966112-dfc8-4611-8fbc-f18b58dca7e6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lptd9\" (UID: \"89966112-dfc8-4611-8fbc-f18b58dca7e6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.874855 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:01 crc kubenswrapper[4782]: E0130 18:31:01.875927 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.375891209 +0000 UTC m=+38.644269374 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.888893 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0a981a6c-a999-4661-89fc-4c1e04c6fcac-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nsxlx\" (UID: \"0a981a6c-a999-4661-89fc-4c1e04c6fcac\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.915223 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c326d3fd-0db9-4682-a522-31bc619a2e5a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-k8wwb\" (UID: \"c326d3fd-0db9-4682-a522-31bc619a2e5a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.950107 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86pv6\" (UniqueName: \"kubernetes.io/projected/5b76d69f-7ca3-4cb0-a795-b106479a0b50-kube-api-access-86pv6\") pod \"csi-hostpathplugin-2zmw5\" (UID: \"5b76d69f-7ca3-4cb0-a795-b106479a0b50\") " pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.975584 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh57z\" (UniqueName: \"kubernetes.io/projected/74d4b963-83f0-4327-b969-ec69902f92a6-kube-api-access-gh57z\") pod \"ingress-canary-zlh6p\" (UID: \"74d4b963-83f0-4327-b969-ec69902f92a6\") " pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.977579 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:01 crc kubenswrapper[4782]: E0130 18:31:01.979437 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.47941757 +0000 UTC m=+38.747795595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.982207 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" Jan 30 18:31:01 crc kubenswrapper[4782]: I0130 18:31:01.994513 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc2j8\" (UniqueName: \"kubernetes.io/projected/fcdb741e-92ca-479b-a935-9bf1962bf7e5-kube-api-access-lc2j8\") pod \"packageserver-d55dfcdfc-5g8vk\" (UID: \"fcdb741e-92ca-479b-a935-9bf1962bf7e5\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:02 crc kubenswrapper[4782]: W0130 18:31:02.000577 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-4c0305269e51e8f05e5e356b2ffaefa85037b77ae594dc251159996f02fd8e86 WatchSource:0}: Error finding container 4c0305269e51e8f05e5e356b2ffaefa85037b77ae594dc251159996f02fd8e86: Status 404 returned error can't find the container with id 4c0305269e51e8f05e5e356b2ffaefa85037b77ae594dc251159996f02fd8e86 Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.019421 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2lg\" (UniqueName: \"kubernetes.io/projected/c1fb0642-d2bd-43f7-962a-761f1469df24-kube-api-access-xg2lg\") pod \"catalog-operator-68c6474976-fppfn\" (UID: \"c1fb0642-d2bd-43f7-962a-761f1469df24\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.045389 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.054004 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.055196 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8jwr\" (UniqueName: \"kubernetes.io/projected/343fced1-cf4b-4d48-9a25-df17b608e09e-kube-api-access-p8jwr\") pod \"cni-sysctl-allowlist-ds-5xjfh\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.070863 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sspsj\" (UniqueName: \"kubernetes.io/projected/aa83ad46-833b-4087-a185-83a9dee9a31c-kube-api-access-sspsj\") pod \"machine-config-controller-84d6567774-h4vm5\" (UID: \"aa83ad46-833b-4087-a185-83a9dee9a31c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.071817 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.076930 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.080015 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.080250 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.580207355 +0000 UTC m=+38.848585380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.080912 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.081654 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.58162857 +0000 UTC m=+38.850006595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.094916 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcx7k\" (UniqueName: \"kubernetes.io/projected/910b0df0-af99-41ed-8fc2-b4c781979c56-kube-api-access-hcx7k\") pod \"olm-operator-6b444d44fb-fwm2h\" (UID: \"910b0df0-af99-41ed-8fc2-b4c781979c56\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.096587 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7npv\" (UniqueName: \"kubernetes.io/projected/dd0f947d-ef9a-43ea-a5a0-7fe20d429739-kube-api-access-p7npv\") pod \"control-plane-machine-set-operator-78cbb6b69f-df42d\" (UID: \"dd0f947d-ef9a-43ea-a5a0-7fe20d429739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.107471 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbpzc\" (UniqueName: \"kubernetes.io/projected/2fb13904-c3df-4314-a955-cc4be2026b0c-kube-api-access-dbpzc\") pod \"service-ca-9c57cc56f-j25hm\" (UID: \"2fb13904-c3df-4314-a955-cc4be2026b0c\") " pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.129762 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.131328 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8qbk\" (UniqueName: \"kubernetes.io/projected/f3a178dd-c047-41eb-9d74-e3ae83797a89-kube-api-access-f8qbk\") pod \"kube-storage-version-migrator-operator-b67b599dd-hn494\" (UID: \"f3a178dd-c047-41eb-9d74-e3ae83797a89\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.139395 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.151700 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.160577 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.161902 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcvjr\" (UniqueName: \"kubernetes.io/projected/a2e52cd5-7405-4bc4-b27d-6663543f2c60-kube-api-access-tcvjr\") pod \"package-server-manager-789f6589d5-mdlt5\" (UID: \"a2e52cd5-7405-4bc4-b27d-6663543f2c60\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.171988 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.178209 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxqjq\" (UniqueName: \"kubernetes.io/projected/2f8b3777-ad72-4e02-b6ac-3664c32415b5-kube-api-access-hxqjq\") pod \"dns-default-9w2tq\" (UID: \"2f8b3777-ad72-4e02-b6ac-3664c32415b5\") " pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.186265 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.186899 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.187060 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.687038596 +0000 UTC m=+38.955416621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.187141 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.187498 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.687490017 +0000 UTC m=+38.955868042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.194145 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.199734 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8lbj\" (UniqueName: \"kubernetes.io/projected/eb668ab0-eba2-4a3e-97fe-74703c1bb2cc-kube-api-access-v8lbj\") pod \"service-ca-operator-777779d784-mvxx7\" (UID: \"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.202439 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.210785 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zlh6p" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.231933 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.244706 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.247588 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.259028 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.270958 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kvsb\" (UniqueName: \"kubernetes.io/projected/699e1980-2c0b-4f99-8977-3ce25d99f142-kube-api-access-7kvsb\") pod \"collect-profiles-29496630-tq7nr\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.276489 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5796n\" (UniqueName: \"kubernetes.io/projected/06d5ad24-7f40-4e85-8f67-b55196c538d7-kube-api-access-5796n\") pod \"machine-config-server-7mn9q\" (UID: \"06d5ad24-7f40-4e85-8f67-b55196c538d7\") " pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.288837 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h679v\" (UniqueName: \"kubernetes.io/projected/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-kube-api-access-h679v\") pod \"marketplace-operator-79b997595-46dsj\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.295991 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.315534 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.315579 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.317209 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.817153307 +0000 UTC m=+39.085531332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.317291 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.317704 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.81768944 +0000 UTC m=+39.086067465 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.423998 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.424222 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:02.924198072 +0000 UTC m=+39.192576097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.480933 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.519221 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hfj2v"] Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.520447 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7mn9q" Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.525537 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.525840 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.025827348 +0000 UTC m=+39.294205373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.534139 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h"] Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.626370 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.626917 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.12689756 +0000 UTC m=+39.395275585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: W0130 18:31:02.663300 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50209ca4_3459_4af3_8f4c_a377c053e65f.slice/crio-17f3f490a65d83fb58331b9a99f454129fff98da2746e340f9e1e28ea6ba4b1f WatchSource:0}: Error finding container 17f3f490a65d83fb58331b9a99f454129fff98da2746e340f9e1e28ea6ba4b1f: Status 404 returned error can't find the container with id 17f3f490a65d83fb58331b9a99f454129fff98da2746e340f9e1e28ea6ba4b1f Jan 30 18:31:02 crc kubenswrapper[4782]: W0130 18:31:02.726079 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83fd126e_ad11_4c51_a93f_3a7faefdf653.slice/crio-efae3a15630a54cde7805ef6b4955bce6d013f428d63ea1326740e08258ea66e WatchSource:0}: Error finding container efae3a15630a54cde7805ef6b4955bce6d013f428d63ea1326740e08258ea66e: Status 404 returned error can't find the container with id efae3a15630a54cde7805ef6b4955bce6d013f428d63ea1326740e08258ea66e Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.728112 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.729472 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.229454397 +0000 UTC m=+39.497832422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.836622 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.837810 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 18:31:03.337787634 +0000 UTC m=+39.606165659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.901122 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" event={"ID":"72c78b6a-9bc1-46c4-bac8-6fc43df61b5a","Type":"ContainerStarted","Data":"6b7f810a9c188cfe404eb8081d7b79fac02feccc9fc4f1c899aa7d1f3ad043f0"} Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.942140 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:02 crc kubenswrapper[4782]: E0130 18:31:02.942615 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.442601566 +0000 UTC m=+39.710979591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.945991 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" event={"ID":"409b217b-e96d-41d2-a902-b51fcdc07920","Type":"ContainerStarted","Data":"b12d8faf4c12b3cdcf7f2f58f5cbdadd8c3aa2bc528e11d3878e07bf34f63a0f"} Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.954142 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b6e5a81a11467514a30a007b57ebbb146f77fd5c50058179c6e54eac9aaaac24"} Jan 30 18:31:02 crc kubenswrapper[4782]: I0130 18:31:02.999043 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" event={"ID":"7f0188d2-4498-4e3d-8f80-f036de2bd8df","Type":"ContainerStarted","Data":"dfc7c74fdb37cf33384635c0cf993815888e3d2c9c53b4e116a50386d8ef8804"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.001093 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" podStartSLOduration=18.001059272 podStartE2EDuration="18.001059272s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:02.978372487 +0000 UTC m=+39.246750512" watchObservedRunningTime="2026-01-30 18:31:03.001059272 +0000 UTC m=+39.269437307" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.027870 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" event={"ID":"78b4a110-27ac-4a1a-9b2e-091f29aeaf52","Type":"ContainerStarted","Data":"d63a1165ee2bcf6a0f49100047081a1bf2077c911d40ebbe7073cce82f14a62c"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.027954 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" event={"ID":"78b4a110-27ac-4a1a-9b2e-091f29aeaf52","Type":"ContainerStarted","Data":"2ad4582d830e71072464bd15622241bbe2389e0bc02a9c1371deb09dcfb4be8f"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.037722 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k7vd9" event={"ID":"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa","Type":"ContainerStarted","Data":"b4836adda9cd767bc5ffa18e5550e3dbdee94e9893fc963cd7789d3c831f53b3"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.043156 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.043784 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.54376067 +0000 UTC m=+39.812138695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.046485 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" event={"ID":"343fced1-cf4b-4d48-9a25-df17b608e09e","Type":"ContainerStarted","Data":"a73fd03297bd2cbef752c9abfa5674cc212696cfc51a74bc2aa7bbc6a924786b"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.052597 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" event={"ID":"261116f6-a031-456e-8119-4288bdb8a201","Type":"ContainerStarted","Data":"4a7c5ea146b8c34eea88c59a4628c011cfa5a88a741bf7ba43afb2767ef69309"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.052661 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.058705 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" event={"ID":"34055f4b-7168-4722-8e00-d0de4f823f41","Type":"ContainerStarted","Data":"8ea78904964b6a410bbbe10130c270f46b7c30267a9c12752a9d5dd471f6b919"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.069262 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"16e2492ed47806ec2e05ddb8988f336f1a84b0c8a5d373afca2eb858d44c6346"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.069331 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4c0305269e51e8f05e5e356b2ffaefa85037b77ae594dc251159996f02fd8e86"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.071676 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4dqmv" event={"ID":"328b2a40-069b-4eda-b7ae-38f62b5a192a","Type":"ContainerStarted","Data":"b82ac76b86feb9e6b7cad7256c3c1bd64eaeafe27664b73017d6a98ecf9b7ec8"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.072158 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4dqmv" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.073880 4782 generic.go:334] "Generic (PLEG): container finished" podID="6cc01c21-5f0e-4251-a777-a758380c9a4f" containerID="65ae6dfdd6a6816cf1f3af9048f4688c12da706fbf2ce9f7576c7c0fb1b98159" exitCode=0 Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.073955 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" event={"ID":"6cc01c21-5f0e-4251-a777-a758380c9a4f","Type":"ContainerDied","Data":"65ae6dfdd6a6816cf1f3af9048f4688c12da706fbf2ce9f7576c7c0fb1b98159"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.074888 4782 patch_prober.go:28] interesting pod/downloads-7954f5f757-4dqmv 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.074941 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4dqmv" podUID="328b2a40-069b-4eda-b7ae-38f62b5a192a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.075214 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"24193693ce24eaf0571821d960c8a8041fbe0a38e93ec44b7a53bf83e8d4e72c"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.076915 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" event={"ID":"3b9097e3-f69b-49ae-9781-52921de78625","Type":"ContainerStarted","Data":"0c7cc7031574aa9098597345fc3485d79c33b03b0b646afea07bae1efc69baa0"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.081497 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" event={"ID":"059df750-c2da-429e-bae4-c7271be158af","Type":"ContainerStarted","Data":"8a83dfb7af9effd6c233a77b6add42ce363a718a8b333e71ebd1f1d115a41859"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.082115 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.082852 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" event={"ID":"83fd126e-ad11-4c51-a93f-3a7faefdf653","Type":"ContainerStarted","Data":"efae3a15630a54cde7805ef6b4955bce6d013f428d63ea1326740e08258ea66e"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.085348 4782 generic.go:334] "Generic (PLEG): container finished" podID="995092de-971b-4634-8725-eb2cbc63b926" containerID="d79cd3981a60b74e1470b7bfbd2ba6de14f5f93cb23328f9723acad872ae1a13" exitCode=0 Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.086616 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" event={"ID":"995092de-971b-4634-8725-eb2cbc63b926","Type":"ContainerDied","Data":"d79cd3981a60b74e1470b7bfbd2ba6de14f5f93cb23328f9723acad872ae1a13"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.089758 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" event={"ID":"f7b534a5-da04-4cad-86a2-6db5e75da9af","Type":"ContainerStarted","Data":"d3a5b66003c67b36dc640b0cbf716b925eeb6a9d28cfce17288b9e91d07159ef"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.089789 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" event={"ID":"f7b534a5-da04-4cad-86a2-6db5e75da9af","Type":"ContainerStarted","Data":"40b2553bcdac09f62a0bb5d5cc716a8d3d7876a9404c85cdebabf089860fa238"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.090300 4782 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gsv8k 
container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.090758 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" podUID="059df750-c2da-429e-bae4-c7271be158af" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.094362 4782 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xmfnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.094422 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" podUID="6e4804cf-00c0-4598-9254-c5c424b013c2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.094460 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" event={"ID":"50209ca4-3459-4af3-8f4c-a377c053e65f","Type":"ContainerStarted","Data":"17f3f490a65d83fb58331b9a99f454129fff98da2746e340f9e1e28ea6ba4b1f"} Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.097079 4782 patch_prober.go:28] interesting pod/console-operator-58897d9998-v7rdd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.097102 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" podUID="66e5834c-c620-441b-8bf2-ee904388abd4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.100257 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.151623 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" podStartSLOduration=18.151607045 podStartE2EDuration="18.151607045s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:03.148622183 +0000 UTC m=+39.417000218" watchObservedRunningTime="2026-01-30 18:31:03.151607045 +0000 UTC m=+39.419985070" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.152195 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.156411 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.65639475 +0000 UTC m=+39.924772775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: W0130 18:31:03.172600 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d5ad24_7f40_4e85_8f67_b55196c538d7.slice/crio-75bd7da6534e4b312e8242926efaf985edb69e116bb0b7263aa9cf0f474cdd52 WatchSource:0}: Error finding container 75bd7da6534e4b312e8242926efaf985edb69e116bb0b7263aa9cf0f474cdd52: Status 404 returned error can't find the container with id 75bd7da6534e4b312e8242926efaf985edb69e116bb0b7263aa9cf0f474cdd52 Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.235104 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.255436 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.255587 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.755563746 +0000 UTC m=+40.023941771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.256293 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.269654 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.769636685 +0000 UTC m=+40.038014710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.279426 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" podStartSLOduration=18.27940521 podStartE2EDuration="18.27940521s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:03.279209455 +0000 UTC m=+39.547587480" watchObservedRunningTime="2026-01-30 18:31:03.27940521 +0000 UTC m=+39.547783235" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.312144 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.332625 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2zmw5"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.338518 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.344179 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hdvw5"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.358853 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 
18:31:03.359262 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.859244731 +0000 UTC m=+40.127622756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: W0130 18:31:03.364677 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89966112_dfc8_4611_8fbc_f18b58dca7e6.slice/crio-e4dfd40346e58258270c1ec2f198fae0984cfe12e9e6d3dfc1cceafe0c272162 WatchSource:0}: Error finding container e4dfd40346e58258270c1ec2f198fae0984cfe12e9e6d3dfc1cceafe0c272162: Status 404 returned error can't find the container with id e4dfd40346e58258270c1ec2f198fae0984cfe12e9e6d3dfc1cceafe0c272162 Jan 30 18:31:03 crc kubenswrapper[4782]: W0130 18:31:03.420799 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod536ab9b3_58fe_47c6_9917_d985d8a986eb.slice/crio-170040ee8886d86252b3889ee82e8b975ea01972038f01042601cfb715e38aad WatchSource:0}: Error finding container 170040ee8886d86252b3889ee82e8b975ea01972038f01042601cfb715e38aad: Status 404 returned error can't find the container with id 170040ee8886d86252b3889ee82e8b975ea01972038f01042601cfb715e38aad Jan 30 18:31:03 crc kubenswrapper[4782]: W0130 18:31:03.440920 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4c9c21a_67d6_4e7b_af12_671a2a108a57.slice/crio-64ecaa7b7b27d7206a7105d0e6489dd58536c794fc75221135bf994a2d65efe5 WatchSource:0}: Error finding container 64ecaa7b7b27d7206a7105d0e6489dd58536c794fc75221135bf994a2d65efe5: Status 404 returned error can't find the container with id 64ecaa7b7b27d7206a7105d0e6489dd58536c794fc75221135bf994a2d65efe5 Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.460896 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.461565 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:03.961546823 +0000 UTC m=+40.229924848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.558928 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zlh6p"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.561604 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.562651 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.062634335 +0000 UTC m=+40.331012360 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.596452 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.626071 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.654211 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zbxkw" podStartSLOduration=18.654189468 podStartE2EDuration="18.654189468s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:03.636092452 +0000 UTC m=+39.904470477" watchObservedRunningTime="2026-01-30 18:31:03.654189468 +0000 UTC m=+39.922567493" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.654986 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.664189 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 
30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.664990 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.164976677 +0000 UTC m=+40.433354692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.765673 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.766171 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.266140272 +0000 UTC m=+40.534518297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: W0130 18:31:03.779154 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc326d3fd_0db9_4682_a522_31bc619a2e5a.slice/crio-a2e45dcd1b5e270dfa39f1a3a95f6dbf4905ad4de39a36255938a9e4b797f60a WatchSource:0}: Error finding container a2e45dcd1b5e270dfa39f1a3a95f6dbf4905ad4de39a36255938a9e4b797f60a: Status 404 returned error can't find the container with id a2e45dcd1b5e270dfa39f1a3a95f6dbf4905ad4de39a36255938a9e4b797f60a Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.796682 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nlpdm" podStartSLOduration=18.796664486 podStartE2EDuration="18.796664486s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:03.795520319 +0000 UTC m=+40.063898344" watchObservedRunningTime="2026-01-30 18:31:03.796664486 +0000 UTC m=+40.065042511" Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.852197 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9w2tq"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.868777 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.869268 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.369252493 +0000 UTC m=+40.637630518 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.878067 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.887679 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.894893 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.934667 4782 csr.go:261] certificate signing request csr-v84q2 is approved, waiting to be issued Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.943976 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5"] Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.949404 4782 csr.go:257] certificate signing request csr-v84q2 is issued Jan 30 18:31:03 crc kubenswrapper[4782]: I0130 18:31:03.981201 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:03 crc kubenswrapper[4782]: E0130 18:31:03.981623 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.481607065 +0000 UTC m=+40.749985090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.087997 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.088522 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.588505547 +0000 UTC m=+40.856883562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: W0130 18:31:04.095086 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa83ad46_833b_4087_a185_83a9dee9a31c.slice/crio-ffa7f70505ca47de14ab11d775674cbc7ea8589d4e475cc8df6292d990b64422 WatchSource:0}: Error finding container ffa7f70505ca47de14ab11d775674cbc7ea8589d4e475cc8df6292d990b64422: Status 404 returned error can't find the container with id ffa7f70505ca47de14ab11d775674cbc7ea8589d4e475cc8df6292d990b64422 Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.159511 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" event={"ID":"536ab9b3-58fe-47c6-9917-d985d8a986eb","Type":"ContainerStarted","Data":"170040ee8886d86252b3889ee82e8b975ea01972038f01042601cfb715e38aad"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.190400 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx"] Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.194079 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.195024 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " 
pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.198782 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" event={"ID":"3b9097e3-f69b-49ae-9781-52921de78625","Type":"ContainerStarted","Data":"9182aec9122e1baadc32838b6c352e405fa62a38953690e724a47b9e121a0389"} Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.201444 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.701403054 +0000 UTC m=+40.969781079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.212292 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/327fdbe8-f465-4ab9-9478-c937cb925ca1-metrics-certs\") pod \"network-metrics-daemon-d7zh6\" (UID: \"327fdbe8-f465-4ab9-9478-c937cb925ca1\") " pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.213294 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn"] Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.228201 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d"] Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.256324 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j25hm"] Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.256414 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" event={"ID":"fcdb741e-92ca-479b-a935-9bf1962bf7e5","Type":"ContainerStarted","Data":"fa5ef2ef60c776197c6059ef80996790695f43f33763404a5bd694957f08b294"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.280203 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9w2tq" event={"ID":"2f8b3777-ad72-4e02-b6ac-3664c32415b5","Type":"ContainerStarted","Data":"cb86d44175e71ba6cf35be60721407436b5e261cf09cab1fd47957e51558446c"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.282813 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-46dsj"] Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.286624 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" podStartSLOduration=19.286595903 podStartE2EDuration="19.286595903s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.227943892 +0000 UTC m=+40.496321917" 
watchObservedRunningTime="2026-01-30 18:31:04.286595903 +0000 UTC m=+40.554973929" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.286728 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr"] Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.296797 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.298953 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.79893747 +0000 UTC m=+41.067315495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: W0130 18:31:04.300600 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1fb0642_d2bd_43f7_962a_761f1469df24.slice/crio-4c362a52a05708a42f61836a193d174705c0ac7d673a1a0e4594f78aa32ae851 WatchSource:0}: Error finding container 4c362a52a05708a42f61836a193d174705c0ac7d673a1a0e4594f78aa32ae851: Status 404 returned error can't find the container with id 4c362a52a05708a42f61836a193d174705c0ac7d673a1a0e4594f78aa32ae851 Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.302912 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" event={"ID":"50209ca4-3459-4af3-8f4c-a377c053e65f","Type":"ContainerStarted","Data":"ebd7c63352f16273d9ba0ad55d2f1766078d73939e1cf2ed4ca9c5a92a8d6585"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.339411 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" event={"ID":"b4c9c21a-67d6-4e7b-af12-671a2a108a57","Type":"ContainerStarted","Data":"64ecaa7b7b27d7206a7105d0e6489dd58536c794fc75221135bf994a2d65efe5"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.360056 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7zh6" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.377974 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4dqmv" podStartSLOduration=19.377949732 podStartE2EDuration="19.377949732s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.340141542 +0000 UTC m=+40.608519577" watchObservedRunningTime="2026-01-30 18:31:04.377949732 +0000 UTC m=+40.646327757" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.398794 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.399330 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:04.899308566 +0000 UTC m=+41.167686591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.455614 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b3b60e44a039b2396e8dbdc1785a709dcb3f1ce45e7e16b27d6ea672869f6789"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.466385 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vxvv" podStartSLOduration=19.466364869 podStartE2EDuration="19.466364869s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.377570222 +0000 UTC m=+40.645948247" watchObservedRunningTime="2026-01-30 18:31:04.466364869 +0000 UTC m=+40.734742894" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.503998 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.504502 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 18:31:05.004482376 +0000 UTC m=+41.272860401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.532605 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" event={"ID":"34055f4b-7168-4722-8e00-d0de4f823f41","Type":"ContainerStarted","Data":"b4d51a82af8d9e435a3e4f42739b4194e4873a7be5fb1a3a6958fd0e5f634efd"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.534586 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2clrq" podStartSLOduration=19.534537209 podStartE2EDuration="19.534537209s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.466458951 +0000 UTC m=+40.734836966" watchObservedRunningTime="2026-01-30 18:31:04.534537209 +0000 UTC m=+40.802915234" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.562480 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7mn9q" event={"ID":"06d5ad24-7f40-4e85-8f67-b55196c538d7","Type":"ContainerStarted","Data":"3000d815dde95b607cbaeebd4282fde1ec23278f61f2730359455f7f68dd4a00"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.562537 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7mn9q" event={"ID":"06d5ad24-7f40-4e85-8f67-b55196c538d7","Type":"ContainerStarted","Data":"75bd7da6534e4b312e8242926efaf985edb69e116bb0b7263aa9cf0f474cdd52"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.594488 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zlh6p" event={"ID":"74d4b963-83f0-4327-b969-ec69902f92a6","Type":"ContainerStarted","Data":"58dc13d156d6bbb0439331a24a0101a4b048b27ca4aed218096248716e3d9760"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.604959 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.606222 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.106192543 +0000 UTC m=+41.374570568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.608967 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rj6vz" podStartSLOduration=19.60894113 podStartE2EDuration="19.60894113s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.558095196 +0000 UTC m=+40.826473221" watchObservedRunningTime="2026-01-30 18:31:04.60894113 +0000 UTC m=+40.877319155" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.635732 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" event={"ID":"89966112-dfc8-4611-8fbc-f18b58dca7e6","Type":"ContainerStarted","Data":"e4dfd40346e58258270c1ec2f198fae0984cfe12e9e6d3dfc1cceafe0c272162"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.648860 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" event={"ID":"343fced1-cf4b-4d48-9a25-df17b608e09e","Type":"ContainerStarted","Data":"54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.649587 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.657846 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wqdz8" podStartSLOduration=19.657829066 podStartE2EDuration="19.657829066s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.61063579 +0000 UTC m=+40.879013815" watchObservedRunningTime="2026-01-30 18:31:04.657829066 +0000 UTC m=+40.926207081" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.666471 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" podStartSLOduration=19.666448743 podStartE2EDuration="19.666448743s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.655377347 +0000 UTC m=+40.923755392" watchObservedRunningTime="2026-01-30 18:31:04.666448743 +0000 UTC m=+40.934826758" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.689617 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k7vd9" event={"ID":"7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa","Type":"ContainerStarted","Data":"a2ed29a0c07a673b30191671d501c039a704f1299a7c6adccc463c03d8f52a21"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.711037 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.721305 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.221267992 +0000 UTC m=+41.489646017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.722066 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hcttm" podStartSLOduration=19.722047661 podStartE2EDuration="19.722047661s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.706709002 +0000 UTC m=+40.975087027" watchObservedRunningTime="2026-01-30 18:31:04.722047661 +0000 UTC m=+40.990425686" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.768379 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" event={"ID":"83fd126e-ad11-4c51-a93f-3a7faefdf653","Type":"ContainerStarted","Data":"33caf330b077b77419d94fa84c55de291d9b49d0c9ea855137f01bf9e0725305"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.800697 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" podStartSLOduration=5.800672543 podStartE2EDuration="5.800672543s" podCreationTimestamp="2026-01-30 18:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.770057936 +0000 UTC m=+41.038435961" watchObservedRunningTime="2026-01-30 18:31:04.800672543 +0000 UTC m=+41.069050568" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.808325 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" event={"ID":"115be544-7346-440e-b0bd-04a31a98da70","Type":"ContainerStarted","Data":"7314ef092275635d94d421196f5a28cfb36eb947722a576a1339b39d4a576cfb"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.808600 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" event={"ID":"115be544-7346-440e-b0bd-04a31a98da70","Type":"ContainerStarted","Data":"df77e6495b35a2e351c55387de0e0507d8ae2a7b3e243bcb767f999968ebddfd"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.821948 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-server-7mn9q" podStartSLOduration=5.821914634 podStartE2EDuration="5.821914634s" podCreationTimestamp="2026-01-30 18:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.797052736 +0000 UTC m=+41.065430761" watchObservedRunningTime="2026-01-30 18:31:04.821914634 +0000 UTC m=+41.090292659" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.830565 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.831484 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.831675 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.331655858 +0000 UTC m=+41.600033883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.833064 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.833539 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.333530194 +0000 UTC m=+41.601908209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.835565 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fbcc7cb0e6343e4b5431a178de6aedec4a0e3937027bc7b6d0620b4927bc20e2"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.835694 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.842944 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6ncmc" podStartSLOduration=19.842892919 podStartE2EDuration="19.842892919s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.827420047 +0000 UTC m=+41.095798062" watchObservedRunningTime="2026-01-30 18:31:04.842892919 +0000 UTC m=+41.111270954" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.846062 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" event={"ID":"5b76d69f-7ca3-4cb0-a795-b106479a0b50","Type":"ContainerStarted","Data":"0e1f624fceae65e863c07a22c07557f750f4967b467e3c056f9ffbfb0b6d8675"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.849433 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" event={"ID":"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc","Type":"ContainerStarted","Data":"4ed6c2e84561eab0ca5926f17ebf3b0c9c161188aa55b52d229e29397aa30181"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.851052 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:04 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:04 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:04 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.851103 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.852828 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" event={"ID":"c326d3fd-0db9-4682-a522-31bc619a2e5a","Type":"ContainerStarted","Data":"a2e45dcd1b5e270dfa39f1a3a95f6dbf4905ad4de39a36255938a9e4b797f60a"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.881050 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" event={"ID":"f3a178dd-c047-41eb-9d74-e3ae83797a89","Type":"ContainerStarted","Data":"73ed3b539108c6e572f07c8908e3aa3ca5ab563a209c85fce5dbff98623452b1"} Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.882085 4782 patch_prober.go:28] interesting pod/downloads-7954f5f757-4dqmv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.882170 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4dqmv" podUID="328b2a40-069b-4eda-b7ae-38f62b5a192a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.906193 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kr9hs" podStartSLOduration=19.906161031 podStartE2EDuration="19.906161031s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.90610263 +0000 UTC m=+41.174480655" watchObservedRunningTime="2026-01-30 18:31:04.906161031 +0000 UTC m=+41.174539056" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.918135 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-v7rdd" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.942138 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-k7vd9" podStartSLOduration=19.942117556 podStartE2EDuration="19.942117556s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:04.939984755 +0000 UTC m=+41.208362780" watchObservedRunningTime="2026-01-30 18:31:04.942117556 +0000 UTC m=+41.210495581" Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.948084 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:04 crc kubenswrapper[4782]: E0130 18:31:04.949389 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.449370711 +0000 UTC m=+41.717748736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.950842 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-30 18:26:03 +0000 UTC, rotation deadline is 2026-11-19 00:36:37.267690583 +0000 UTC Jan 30 18:31:04 crc kubenswrapper[4782]: I0130 18:31:04.950875 4782 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7014h5m32.316817626s for next certificate rotation Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.050156 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.054536 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.554512391 +0000 UTC m=+41.822890606 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.151168 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" podStartSLOduration=20.151140536 podStartE2EDuration="20.151140536s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:05.087567316 +0000 UTC m=+41.355945341" watchObservedRunningTime="2026-01-30 18:31:05.151140536 +0000 UTC m=+41.419518561" Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.152024 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.152519 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.652496749 +0000 UTC m=+41.920874774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.196085 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d7zh6"] Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.254179 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.254659 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.754638926 +0000 UTC m=+42.023016951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.356013 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.356473 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.856451966 +0000 UTC m=+42.124829991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.458877 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.459254 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:05.959239909 +0000 UTC m=+42.227617934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.563865 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.564289 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.064267126 +0000 UTC m=+42.332645151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.665398 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.666003 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.165980794 +0000 UTC m=+42.434358809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.772014 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.772394 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.272367454 +0000 UTC m=+42.540745479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.773653 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.819221 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:05 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:05 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:05 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.819330 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.876270 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.876662 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.376648693 +0000 UTC m=+42.645026718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.921130 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" event={"ID":"8ef2e5ab-f6a0-4735-a5c0-0838345128fb","Type":"ContainerStarted","Data":"19b0128bdcbd7bf3b00278907e2c933e0b328d5fd135eb748e767f048e24b320"} Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.940260 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" event={"ID":"f3a178dd-c047-41eb-9d74-e3ae83797a89","Type":"ContainerStarted","Data":"ed1eb7bfff9e8f871b8861e87a693ded1b4d129cc26de9ee9f5f33d6b909ead9"} Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.949217 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" event={"ID":"0a981a6c-a999-4661-89fc-4c1e04c6fcac","Type":"ContainerStarted","Data":"cb1016c4e480a9fbf22a95f0eaa22b28c59941cfbdacc4958887219056c1aaba"} Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.954677 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" event={"ID":"536ab9b3-58fe-47c6-9917-d985d8a986eb","Type":"ContainerStarted","Data":"a6eaa428e1ae8514e5d5cc28702def0a94835a8d1ede5eb2e970db8e75f2b7aa"} Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.979820 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" event={"ID":"c1fb0642-d2bd-43f7-962a-761f1469df24","Type":"ContainerStarted","Data":"a52e812ffec6a3f5650b651322620bf7301a30ac46663d1ae3af460f08d1fd7e"} Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.979883 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" event={"ID":"c1fb0642-d2bd-43f7-962a-761f1469df24","Type":"ContainerStarted","Data":"4c362a52a05708a42f61836a193d174705c0ac7d673a1a0e4594f78aa32ae851"} Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.980588 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.980703 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.480684256 +0000 UTC m=+42.749062281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.980587 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.980835 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:05 crc kubenswrapper[4782]: E0130 18:31:05.981194 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.481179358 +0000 UTC m=+42.749557383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.982393 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" event={"ID":"aa83ad46-833b-4087-a185-83a9dee9a31c","Type":"ContainerStarted","Data":"ffa7f70505ca47de14ab11d775674cbc7ea8589d4e475cc8df6292d990b64422"} Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.990330 4782 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fppfn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 30 18:31:05 crc kubenswrapper[4782]: I0130 18:31:05.990401 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" podUID="c1fb0642-d2bd-43f7-962a-761f1469df24" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:05.993573 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hdvw5" podStartSLOduration=20.993553816 podStartE2EDuration="20.993553816s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:05.992174153 +0000 UTC m=+42.260552188" 
watchObservedRunningTime="2026-01-30 18:31:05.993553816 +0000 UTC m=+42.261931841" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.023030 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" event={"ID":"995092de-971b-4634-8725-eb2cbc63b926","Type":"ContainerStarted","Data":"f77157fe2d5e1ae53844906dc690cfff13546c1766cb7f2a5a02b17a5c6f0129"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.046096 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" event={"ID":"c326d3fd-0db9-4682-a522-31bc619a2e5a","Type":"ContainerStarted","Data":"8ec7b46693c930c9d2bad01fde909712df6fe3e42ca3a2519c6e1aaf7c9f5c0a"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.048342 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" event={"ID":"6cc01c21-5f0e-4251-a777-a758380c9a4f","Type":"ContainerStarted","Data":"e79c3ef92b81ee15f4b85b3b41860f4cd8ae94048ceeb24ca1605bccc9ab3bf0"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.049123 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" event={"ID":"dd0f947d-ef9a-43ea-a5a0-7fe20d429739","Type":"ContainerStarted","Data":"87812d48a57ef9bdd6790618fa15e64249e4261b546294ec1f79310d51e1d38c"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.049770 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" event={"ID":"699e1980-2c0b-4f99-8977-3ce25d99f142","Type":"ContainerStarted","Data":"cd338ffc20a24302558dbae43fbf0e66bbfeed610c616117a073082e237573b0"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.050403 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" event={"ID":"2fb13904-c3df-4314-a955-cc4be2026b0c","Type":"ContainerStarted","Data":"d7a90edd8ccf828e3f4dd34b24bb5d55e737426498330e527b8452371190f889"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.088019 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.090093 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.590066648 +0000 UTC m=+42.858444673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.097780 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" event={"ID":"115be544-7346-440e-b0bd-04a31a98da70","Type":"ContainerStarted","Data":"04b56f5a1d8bde202359e286fa6533e2933b8c390ea55b593064603f139979f7"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.138342 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" event={"ID":"910b0df0-af99-41ed-8fc2-b4c781979c56","Type":"ContainerStarted","Data":"7909390c5ba76b5a6076199ec2113d587e9b68aaf4d38e8a7491c282a1d00ec6"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.138397 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" event={"ID":"910b0df0-af99-41ed-8fc2-b4c781979c56","Type":"ContainerStarted","Data":"7955ae272f45d4560da27d24689e116c5afb7ce8aa2828ad774b9a76e48bc2b5"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.171503 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" podStartSLOduration=21.171483857 podStartE2EDuration="21.171483857s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:06.072357232 +0000 UTC m=+42.340735257" watchObservedRunningTime="2026-01-30 18:31:06.171483857 +0000 UTC m=+42.439861882" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.191617 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" event={"ID":"b4c9c21a-67d6-4e7b-af12-671a2a108a57","Type":"ContainerStarted","Data":"5f6e5f4072502c1aca910e339b28f03aa32923803cbb2370e5c6c04cd819fc81"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.193319 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.194643 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.694627944 +0000 UTC m=+42.963005969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.223036 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" event={"ID":"fcdb741e-92ca-479b-a935-9bf1962bf7e5","Type":"ContainerStarted","Data":"193a94d14f55355eab90bb0b80d5b53f5fdef07e8914654e8ea1dfc138f39d33"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.223518 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.238384 4782 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5g8vk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.238447 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" podUID="fcdb741e-92ca-479b-a935-9bf1962bf7e5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.240111 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" podStartSLOduration=21.240101548 podStartE2EDuration="21.240101548s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:06.173725571 +0000 UTC m=+42.442103596" watchObservedRunningTime="2026-01-30 18:31:06.240101548 +0000 UTC m=+42.508479573" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.267872 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d7zh6" event={"ID":"327fdbe8-f465-4ab9-9478-c937cb925ca1","Type":"ContainerStarted","Data":"bff83195a6c612235dd614dbd1a7c7c19793ba2023ff9f2c01124abc63395eba"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.294960 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" podStartSLOduration=21.294937137 podStartE2EDuration="21.294937137s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:06.292066768 +0000 UTC m=+42.560444793" watchObservedRunningTime="2026-01-30 18:31:06.294937137 +0000 UTC m=+42.563315162" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.295059 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.295734 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqqqd" podStartSLOduration=21.295727906 podStartE2EDuration="21.295727906s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:06.242602698 +0000 UTC m=+42.510980723" watchObservedRunningTime="2026-01-30 18:31:06.295727906 +0000 UTC m=+42.564105931" Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.296116 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.796092005 +0000 UTC m=+43.064470030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.296337 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-6mr8h" event={"ID":"83fd126e-ad11-4c51-a93f-3a7faefdf653","Type":"ContainerStarted","Data":"40b875b570e2b779a208010654dafb3cbeb67051d6862a0e3a961af9805712b8"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.369042 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" event={"ID":"a2e52cd5-7405-4bc4-b27d-6663543f2c60","Type":"ContainerStarted","Data":"be08ecc921aa7810cba9504dca9ea91cc669104c7aef5a14c64a458b41e18160"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.397527 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.400508 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:06.900483067 +0000 UTC m=+43.168861092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.400818 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zlh6p" event={"ID":"74d4b963-83f0-4327-b969-ec69902f92a6","Type":"ContainerStarted","Data":"086d456dcbdc205bd541baa4de0a3faae956a187269d3f78c9fcab48919e1531"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.451593 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" event={"ID":"89966112-dfc8-4611-8fbc-f18b58dca7e6","Type":"ContainerStarted","Data":"435cc56ad35723121a1b85ca5b04f45b0b3d881206346df48ace323d1bf6f3da"} Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.484053 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zlh6p" podStartSLOduration=8.484029437 podStartE2EDuration="8.484029437s" podCreationTimestamp="2026-01-30 18:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:06.474854086 +0000 UTC m=+42.743232121" watchObservedRunningTime="2026-01-30 18:31:06.484029437 +0000 UTC m=+42.752407462" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.484835 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-297fm" podStartSLOduration=21.484829487 podStartE2EDuration="21.484829487s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:06.360656679 +0000 UTC m=+42.629034704" watchObservedRunningTime="2026-01-30 18:31:06.484829487 +0000 UTC m=+42.753207512" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.504902 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.506701 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.006677942 +0000 UTC m=+43.275055967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.629256 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.631609 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.131586808 +0000 UTC m=+43.399965013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.661960 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.733555 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.734065 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.234042683 +0000 UTC m=+43.502420708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.735207 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lptd9" podStartSLOduration=21.73518401 podStartE2EDuration="21.73518401s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:06.529735647 +0000 UTC m=+42.798113672" watchObservedRunningTime="2026-01-30 18:31:06.73518401 +0000 UTC m=+43.003562035" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.817929 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:06 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:06 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:06 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.818001 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.833763 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rbvgr" Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.834780 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.835165 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.335147956 +0000 UTC m=+43.603525981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:06 crc kubenswrapper[4782]: I0130 18:31:06.936136 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:06 crc kubenswrapper[4782]: E0130 18:31:06.937843 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.437817586 +0000 UTC m=+43.706195611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.039907 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.059386 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.559359621 +0000 UTC m=+43.827737646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.141925 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.142256 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.642189554 +0000 UTC m=+43.910567599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.142412 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.142798 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.642780908 +0000 UTC m=+43.911158933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.216170 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5xjfh"] Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.243422 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.243662 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.743625234 +0000 UTC m=+44.012003259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.243801 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.244194 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.744185858 +0000 UTC m=+44.012563883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.345524 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.345760 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.845730821 +0000 UTC m=+44.114108846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.345866 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.346246 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.846213613 +0000 UTC m=+44.114591638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.446634 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.446848 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.946817603 +0000 UTC m=+44.215195638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.447313 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.447718 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:07.947706905 +0000 UTC m=+44.216084930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.448705 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" event={"ID":"a2e52cd5-7405-4bc4-b27d-6663543f2c60","Type":"ContainerStarted","Data":"9086742a3fa92448ca77727408418b9f9d4e4abc1d2f07d3fb18dd881e0ff1c0"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.448764 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" event={"ID":"a2e52cd5-7405-4bc4-b27d-6663543f2c60","Type":"ContainerStarted","Data":"a71b6bf41f70e8d2e88ab16ed8161244d70d0b33fb5ae11f80ae958e3e07295d"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.451693 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" event={"ID":"6cc01c21-5f0e-4251-a777-a758380c9a4f","Type":"ContainerStarted","Data":"b368f1eeeb7bafc928c6656b972e1c8ec34688a2312c2d66362319f88e5b5d44"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.453472 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" event={"ID":"dd0f947d-ef9a-43ea-a5a0-7fe20d429739","Type":"ContainerStarted","Data":"7693321f4038dba969fde835d2a72d6b3137e9f83734c90176ecdeccd8537e99"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.455113 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d7zh6" event={"ID":"327fdbe8-f465-4ab9-9478-c937cb925ca1","Type":"ContainerStarted","Data":"49fc04654a857c8a9916c0cc5b5352cb1b0b24ca9c6762cd74b13d2c9068847b"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.458434 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" event={"ID":"50209ca4-3459-4af3-8f4c-a377c053e65f","Type":"ContainerStarted","Data":"a20c8f884009751c1c71667521debf381d0185b2c535562447f0625d6cfcb192"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.460310 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" event={"ID":"8ef2e5ab-f6a0-4735-a5c0-0838345128fb","Type":"ContainerStarted","Data":"0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.460547 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.462261 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" event={"ID":"5b76d69f-7ca3-4cb0-a795-b106479a0b50","Type":"ContainerStarted","Data":"d16c843dc6f42245f264b2e3455d3fa881188ea8f6b244cb58adebd0f5db0fba"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.463644 4782 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-46dsj container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.463716 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.464434 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" event={"ID":"0a981a6c-a999-4661-89fc-4c1e04c6fcac","Type":"ContainerStarted","Data":"e43632379ad2de36365c2c1bd97816d59e62447941048ea962a56ee50dc82326"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.466499 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" event={"ID":"aa83ad46-833b-4087-a185-83a9dee9a31c","Type":"ContainerStarted","Data":"374926cbbc3fbd8059240c4bb1383a10a5ce8155af5e6cf49d266d8ae984f4a5"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.466549 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" event={"ID":"aa83ad46-833b-4087-a185-83a9dee9a31c","Type":"ContainerStarted","Data":"c13d02b5773eba59d9ebb1e896eb25d74a47fa8d358b5da0cc0e105984d7c7e1"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.469065 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9w2tq" event={"ID":"2f8b3777-ad72-4e02-b6ac-3664c32415b5","Type":"ContainerStarted","Data":"273311a2dc5ce337edcc8d1505afdfd787c4123c84ea9e9ac035863016f3ef99"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.469109 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9w2tq" event={"ID":"2f8b3777-ad72-4e02-b6ac-3664c32415b5","Type":"ContainerStarted","Data":"f30715ccfa7e53a8ad9fdb8a919126f3d26a7322f3195127910dc1a6e71fabda"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.470651 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" event={"ID":"699e1980-2c0b-4f99-8977-3ce25d99f142","Type":"ContainerStarted","Data":"e013d0bfd822f7553ec9d48d35b58477e4484680b2711ed884d6e4ddf3ff94c2"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.472522 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" event={"ID":"2fb13904-c3df-4314-a955-cc4be2026b0c","Type":"ContainerStarted","Data":"35c9399438a41a31e064e4c894555e2a636069a77fff7eaeb061e398b0614ab6"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.473777 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" event={"ID":"eb668ab0-eba2-4a3e-97fe-74703c1bb2cc","Type":"ContainerStarted","Data":"00c717cac6f62da14e70f8e671e5792f1eebcd9874fe4b492af3b8f438dbc3dc"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.476816 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" 
event={"ID":"c326d3fd-0db9-4682-a522-31bc619a2e5a","Type":"ContainerStarted","Data":"c172af62299929a62d7f10df9e46aef0d02c943d9370d5d39bf868f47ef4c3a1"} Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.477527 4782 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5g8vk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.477590 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" podUID="fcdb741e-92ca-479b-a935-9bf1962bf7e5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.477670 4782 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fppfn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.477720 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" podUID="c1fb0642-d2bd-43f7-962a-761f1469df24" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.485408 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" podStartSLOduration=22.48538551 podStartE2EDuration="22.48538551s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.483554556 +0000 UTC m=+43.751932591" watchObservedRunningTime="2026-01-30 18:31:07.48538551 +0000 UTC m=+43.753763535" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.502721 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" podStartSLOduration=22.502697947 podStartE2EDuration="22.502697947s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.502638356 +0000 UTC m=+43.771016371" watchObservedRunningTime="2026-01-30 18:31:07.502697947 +0000 UTC m=+43.771075972" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.519877 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hfj2v" podStartSLOduration=22.5198566 podStartE2EDuration="22.5198566s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.516049228 +0000 UTC m=+43.784427253" watchObservedRunningTime="2026-01-30 18:31:07.5198566 +0000 UTC m=+43.788234625" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.542587 4782 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-df42d" podStartSLOduration=22.542565026 podStartE2EDuration="22.542565026s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.540529067 +0000 UTC m=+43.808907092" watchObservedRunningTime="2026-01-30 18:31:07.542565026 +0000 UTC m=+43.810943051" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.548318 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.548570 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.0485446 +0000 UTC m=+44.316922625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.553021 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.554105 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.054091094 +0000 UTC m=+44.322469119 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.569347 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-j25hm" podStartSLOduration=22.56932712 podStartE2EDuration="22.56932712s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.567906966 +0000 UTC m=+43.836284991" watchObservedRunningTime="2026-01-30 18:31:07.56932712 +0000 UTC m=+43.837705145" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.595561 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" podStartSLOduration=22.595535021 podStartE2EDuration="22.595535021s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.591092834 +0000 UTC m=+43.859470859" watchObservedRunningTime="2026-01-30 18:31:07.595535021 +0000 UTC m=+43.863913046" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.605664 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-h4vm5" podStartSLOduration=22.605643444000002 podStartE2EDuration="22.605643444s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.604609339 +0000 UTC m=+43.872987364" watchObservedRunningTime="2026-01-30 18:31:07.605643444 +0000 UTC m=+43.874021469" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.624713 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-k8wwb" podStartSLOduration=22.624695282 podStartE2EDuration="22.624695282s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.62253394 +0000 UTC m=+43.890911965" watchObservedRunningTime="2026-01-30 18:31:07.624695282 +0000 UTC m=+43.893073307" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.653304 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nsxlx" podStartSLOduration=22.65328676 podStartE2EDuration="22.65328676s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.6524356 +0000 UTC m=+43.920813625" watchObservedRunningTime="2026-01-30 18:31:07.65328676 +0000 UTC m=+43.921664785" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.657440 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.657676 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.157645885 +0000 UTC m=+44.426023910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.658405 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.659549 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.159539651 +0000 UTC m=+44.427917676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.679121 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hn494" podStartSLOduration=22.679101222 podStartE2EDuration="22.679101222s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.678862906 +0000 UTC m=+43.947240931" watchObservedRunningTime="2026-01-30 18:31:07.679101222 +0000 UTC m=+43.947479247" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.701145 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mvxx7" podStartSLOduration=22.701129302 podStartE2EDuration="22.701129302s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.699498282 +0000 UTC m=+43.967876307" watchObservedRunningTime="2026-01-30 18:31:07.701129302 +0000 UTC m=+43.969507327" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.760496 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.760741 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.260705275 +0000 UTC m=+44.529083300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.761357 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.762078 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.262061908 +0000 UTC m=+44.530439933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.810545 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:07 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:07 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:07 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.810618 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.862303 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.862451 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.362427243 +0000 UTC m=+44.630805268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.862681 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.863035 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.363027507 +0000 UTC m=+44.631405532 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.964199 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.964731 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.464681833 +0000 UTC m=+44.733059858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:07 crc kubenswrapper[4782]: I0130 18:31:07.964957 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:07 crc kubenswrapper[4782]: E0130 18:31:07.965468 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.465452722 +0000 UTC m=+44.733830747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.065898 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.066112 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.566082483 +0000 UTC m=+44.834460508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.066436 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.066884 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.566874452 +0000 UTC m=+44.835252477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.167209 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.167426 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.66738412 +0000 UTC m=+44.935762145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.167605 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.168009 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.667994035 +0000 UTC m=+44.936372060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.269326 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.269708 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.769672962 +0000 UTC m=+45.038050987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.269847 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.270194 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.770181614 +0000 UTC m=+45.038559639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.370858 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.371070 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.8710064 +0000 UTC m=+45.139384425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.371133 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.371554 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.871546063 +0000 UTC m=+45.139924088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.472082 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.472742 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:08.972708947 +0000 UTC m=+45.241087002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.485435 4782 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-46dsj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.485490 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.486247 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d7zh6" event={"ID":"327fdbe8-f465-4ab9-9478-c937cb925ca1","Type":"ContainerStarted","Data":"e8f55173671c6b30990909a7c2dac5717d28f0887189564d4ad368c6c40f0f25"} Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.486374 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" gracePeriod=30 Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.491362 4782 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fppfn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.491421 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" podUID="c1fb0642-d2bd-43f7-962a-761f1469df24" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.516096 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" podStartSLOduration=23.51607483 podStartE2EDuration="23.51607483s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:07.729776881 +0000 UTC m=+43.998154916" watchObservedRunningTime="2026-01-30 18:31:08.51607483 +0000 UTC m=+44.784452855" Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.517866 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d7zh6" podStartSLOduration=23.517858413 podStartE2EDuration="23.517858413s" 
podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:08.515394574 +0000 UTC m=+44.783772609" watchObservedRunningTime="2026-01-30 18:31:08.517858413 +0000 UTC m=+44.786236438" Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.541949 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" podStartSLOduration=23.541923992 podStartE2EDuration="23.541923992s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:08.539930704 +0000 UTC m=+44.808308739" watchObservedRunningTime="2026-01-30 18:31:08.541923992 +0000 UTC m=+44.810302017" Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.559502 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9w2tq" podStartSLOduration=9.559480505 podStartE2EDuration="9.559480505s" podCreationTimestamp="2026-01-30 18:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:08.557967458 +0000 UTC m=+44.826345483" watchObservedRunningTime="2026-01-30 18:31:08.559480505 +0000 UTC m=+44.827858540" Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.573638 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.576195 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.076178937 +0000 UTC m=+45.344556962 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.681954 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.682219 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.182180157 +0000 UTC m=+45.450558182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.682721 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.683157 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.18313858 +0000 UTC m=+45.451516605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.783602 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.783908 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.283857724 +0000 UTC m=+45.552235749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.784247 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.784860 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.284832617 +0000 UTC m=+45.553210642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.814429 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:08 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:08 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:08 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.814550 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.886153 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.886368 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.386326349 +0000 UTC m=+45.654704374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.886971 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.887438 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.387427396 +0000 UTC m=+45.655805421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.988215 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.988325 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.488307363 +0000 UTC m=+45.756685388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:08 crc kubenswrapper[4782]: I0130 18:31:08.988573 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:08 crc kubenswrapper[4782]: E0130 18:31:08.988884 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.488875587 +0000 UTC m=+45.757253612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.089529 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.089738 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.589704153 +0000 UTC m=+45.858082188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.090401 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.090904 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.590886551 +0000 UTC m=+45.859264576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.192043 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.192267 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.69221923 +0000 UTC m=+45.960597255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.192723 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.193051 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.693035919 +0000 UTC m=+45.961413944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.293891 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.294097 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.79406668 +0000 UTC m=+46.062444705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.294247 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.294564 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.794551242 +0000 UTC m=+46.062929267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.395039 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.395253 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.895213354 +0000 UTC m=+46.163591379 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.395769 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.396168 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.896158687 +0000 UTC m=+46.164536712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.491084 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" event={"ID":"5b76d69f-7ca3-4cb0-a795-b106479a0b50","Type":"ContainerStarted","Data":"339f8853351a4d3664125c5ddc130b2daf1cc1401ecd746d997fa0627d083435"} Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.497177 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.497398 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.997372972 +0000 UTC m=+46.265750987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.497512 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.497984 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:09.997975847 +0000 UTC m=+46.266353862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.598813 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.599078 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.099047179 +0000 UTC m=+46.367425204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.599561 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.600059 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.100039412 +0000 UTC m=+46.368417437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.614194 4782 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.701394 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.701669 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.201627257 +0000 UTC m=+46.470005282 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.701723 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.702124 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.202115798 +0000 UTC m=+46.470493823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.802767 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.802994 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.302954925 +0000 UTC m=+46.571332950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.803065 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.803742 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.303719713 +0000 UTC m=+46.572097738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.811179 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:09 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:09 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:09 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.811265 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.905028 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.905194 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.405163344 +0000 UTC m=+46.673541369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:09 crc kubenswrapper[4782]: I0130 18:31:09.905389 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:09 crc kubenswrapper[4782]: E0130 18:31:09.905746 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.405731388 +0000 UTC m=+46.674109413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.006515 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.006792 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.506754729 +0000 UTC m=+46.775132754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.007099 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.007571 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.507560908 +0000 UTC m=+46.775938933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.108508 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.108978 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.608957008 +0000 UTC m=+46.877335033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.210842 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.211441 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.711416103 +0000 UTC m=+46.979794128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.213546 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ddhqf"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.214582 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.219762 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.232101 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ddhqf"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.312716 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.312991 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.812950286 +0000 UTC m=+47.081328311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.313095 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.313328 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-catalog-content\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.313475 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-utilities\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.313486 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.813470019 +0000 UTC m=+47.081848044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.313529 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj898\" (UniqueName: \"kubernetes.io/projected/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-kube-api-access-sj898\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.415030 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.415365 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.91533062 +0000 UTC m=+47.183708635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.415495 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-catalog-content\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.415665 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-utilities\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.415756 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj898\" (UniqueName: \"kubernetes.io/projected/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-kube-api-access-sj898\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.415873 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.416185 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-catalog-content\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.416202 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-utilities\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.416383 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:10.916369765 +0000 UTC m=+47.184747790 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.418494 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qg5cd"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.419789 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.421756 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.432941 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qg5cd"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.460096 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj898\" (UniqueName: \"kubernetes.io/projected/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-kube-api-access-sj898\") pod \"certified-operators-ddhqf\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.498460 4782 generic.go:334] "Generic (PLEG): container finished" podID="699e1980-2c0b-4f99-8977-3ce25d99f142" containerID="e013d0bfd822f7553ec9d48d35b58477e4484680b2711ed884d6e4ddf3ff94c2" exitCode=0 Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.498544 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" event={"ID":"699e1980-2c0b-4f99-8977-3ce25d99f142","Type":"ContainerDied","Data":"e013d0bfd822f7553ec9d48d35b58477e4484680b2711ed884d6e4ddf3ff94c2"} Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.501491 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" event={"ID":"5b76d69f-7ca3-4cb0-a795-b106479a0b50","Type":"ContainerStarted","Data":"6ecbaedfce0bce6dfde933ae78b3f5172cfc7604005bfacc02db94922c4e3c31"} Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.501534 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" event={"ID":"5b76d69f-7ca3-4cb0-a795-b106479a0b50","Type":"ContainerStarted","Data":"eb584ca28bb5f340d8e3e6c08944423aec84012a107df67b9d15e7bfa726beba"} Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.502488 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.503080 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.505698 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.506283 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.517358 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.517401 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.517551 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-utilities\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.517576 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 18:31:11.017545809 +0000 UTC m=+47.285923834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.517609 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zwk\" (UniqueName: \"kubernetes.io/projected/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-kube-api-access-d9zwk\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.517644 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-catalog-content\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.517703 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: E0130 18:31:10.518015 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 18:31:11.01800251 +0000 UTC m=+47.286380535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qjchv" (UID: "34770880-dc82-40ff-9989-bbe06f230233") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.527494 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.530606 4782 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-30T18:31:09.614459269Z","Handler":null,"Name":""} Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.558927 4782 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.558970 4782 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.616557 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2zmw5" podStartSLOduration=12.616534411 podStartE2EDuration="12.616534411s" podCreationTimestamp="2026-01-30 18:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:10.612701259 +0000 UTC m=+46.881079284" watchObservedRunningTime="2026-01-30 18:31:10.616534411 +0000 UTC m=+46.884912436" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.618549 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.618800 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-utilities\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.618839 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.618862 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.618881 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zwk\" (UniqueName: \"kubernetes.io/projected/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-kube-api-access-d9zwk\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.618907 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-catalog-content\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.619903 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-catalog-content\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.619972 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-utilities\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.622837 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94k6x"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.624058 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.624324 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.641870 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94k6x"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.649908 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zwk\" (UniqueName: \"kubernetes.io/projected/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-kube-api-access-d9zwk\") pod \"community-operators-qg5cd\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.720464 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-utilities\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.720524 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxb47\" (UniqueName: \"kubernetes.io/projected/76621964-5fa4-4a75-b7d2-9f148a3c701f-kube-api-access-hxb47\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.720567 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.720626 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.720661 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.720684 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-catalog-content\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.720767 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 
18:31:10.733033 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.744880 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.791427 4782 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.791803 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.812533 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.812567 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.818422 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.821949 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-utilities\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.822002 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxb47\" (UniqueName: \"kubernetes.io/projected/76621964-5fa4-4a75-b7d2-9f148a3c701f-kube-api-access-hxb47\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.822034 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5bldf"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.822077 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-catalog-content\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.822489 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:10 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:10 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:10 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.822551 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.822789 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-catalog-content\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.823109 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.832649 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-utilities\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.837055 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bldf"] Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.852399 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.862913 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.868075 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxb47\" (UniqueName: \"kubernetes.io/projected/76621964-5fa4-4a75-b7d2-9f148a3c701f-kube-api-access-hxb47\") pod \"certified-operators-94k6x\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.944154 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qjchv\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.944552 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvrjg\" (UniqueName: \"kubernetes.io/projected/30838f0d-efae-4fdf-b098-14537a312bb3-kube-api-access-lvrjg\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.944635 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-catalog-content\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.944650 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-utilities\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.970257 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.970316 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.973082 
4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.987055 4782 patch_prober.go:28] interesting pod/console-f9d7485db-hcttm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 30 18:31:10 crc kubenswrapper[4782]: I0130 18:31:10.987127 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hcttm" podUID="22efd41f-5357-4820-afa4-09733ef60db0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.002089 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.002306 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.035566 4782 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fq5gm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]log ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]etcd ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/generic-apiserver-start-informers ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/max-in-flight-filter ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 30 18:31:11 crc kubenswrapper[4782]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 30 18:31:11 crc kubenswrapper[4782]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/project.openshift.io-projectcache ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/openshift.io-startinformers ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 30 18:31:11 crc kubenswrapper[4782]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 30 18:31:11 crc kubenswrapper[4782]: livez check failed Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.035637 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" podUID="6cc01c21-5f0e-4251-a777-a758380c9a4f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.037512 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ddhqf"] Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.047096 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvrjg\" (UniqueName: 
\"kubernetes.io/projected/30838f0d-efae-4fdf-b098-14537a312bb3-kube-api-access-lvrjg\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.047184 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-catalog-content\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.047210 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-utilities\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.049520 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-catalog-content\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.049843 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-utilities\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.066359 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvrjg\" (UniqueName: \"kubernetes.io/projected/30838f0d-efae-4fdf-b098-14537a312bb3-kube-api-access-lvrjg\") pod \"community-operators-5bldf\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.121634 4782 patch_prober.go:28] interesting pod/downloads-7954f5f757-4dqmv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.121658 4782 patch_prober.go:28] interesting pod/downloads-7954f5f757-4dqmv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.121713 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4dqmv" podUID="328b2a40-069b-4eda-b7ae-38f62b5a192a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.121729 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4dqmv" podUID="328b2a40-069b-4eda-b7ae-38f62b5a192a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: 
connection refused" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.143998 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.196527 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.261935 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.287925 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qg5cd"] Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.355880 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.366554 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qjchv"] Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.463081 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94k6x"] Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.521612 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94k6x" event={"ID":"76621964-5fa4-4a75-b7d2-9f148a3c701f","Type":"ContainerStarted","Data":"6726cb013c2eb79bf5e7a65b364ba38d3a531043adc006afadb9d079cfc610c6"} Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.524444 4782 generic.go:334] "Generic (PLEG): container finished" podID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerID="60da5b1aad0690940671038f93bcbcd97b3abdcedcf1391c5e3a5d9bbc77f064" exitCode=0 Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.524568 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddhqf" event={"ID":"b84c2e64-bb1c-40ea-a369-55cce87dc7d7","Type":"ContainerDied","Data":"60da5b1aad0690940671038f93bcbcd97b3abdcedcf1391c5e3a5d9bbc77f064"} Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.524623 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddhqf" event={"ID":"b84c2e64-bb1c-40ea-a369-55cce87dc7d7","Type":"ContainerStarted","Data":"c8b8b1c1a70b9bdbbab003fb41fcd931ea762e883bbdadcf416d2fbdae24d833"} Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.525977 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" event={"ID":"34770880-dc82-40ff-9989-bbe06f230233","Type":"ContainerStarted","Data":"86499724aff3426fb2deab2c7c2255fc236a71684f1bd9f0f9a7d189195f231e"} Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.528023 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.533970 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1","Type":"ContainerStarted","Data":"ec3713628237d3c0b1c8da319d4257ca74253ebcacfdbffe4f282b9410c112a8"} Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.539890 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg5cd" 
event={"ID":"0e13d1a3-9ea0-470c-8e34-c935718e7fcf","Type":"ContainerStarted","Data":"042d59e0f4d6c7ea46dc1f11088918ba92cd654fabafbadef94e8fe20a503b49"} Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.546998 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bl25w" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.688042 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bldf"] Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.807436 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.812142 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:11 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:11 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:11 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.812219 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:11 crc kubenswrapper[4782]: I0130 18:31:11.932848 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.074378 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699e1980-2c0b-4f99-8977-3ce25d99f142-config-volume\") pod \"699e1980-2c0b-4f99-8977-3ce25d99f142\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.074471 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699e1980-2c0b-4f99-8977-3ce25d99f142-secret-volume\") pod \"699e1980-2c0b-4f99-8977-3ce25d99f142\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.074517 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kvsb\" (UniqueName: \"kubernetes.io/projected/699e1980-2c0b-4f99-8977-3ce25d99f142-kube-api-access-7kvsb\") pod \"699e1980-2c0b-4f99-8977-3ce25d99f142\" (UID: \"699e1980-2c0b-4f99-8977-3ce25d99f142\") " Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.075141 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699e1980-2c0b-4f99-8977-3ce25d99f142-config-volume" (OuterVolumeSpecName: "config-volume") pod "699e1980-2c0b-4f99-8977-3ce25d99f142" (UID: "699e1980-2c0b-4f99-8977-3ce25d99f142"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.114436 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699e1980-2c0b-4f99-8977-3ce25d99f142-kube-api-access-7kvsb" (OuterVolumeSpecName: "kube-api-access-7kvsb") pod "699e1980-2c0b-4f99-8977-3ce25d99f142" (UID: "699e1980-2c0b-4f99-8977-3ce25d99f142"). InnerVolumeSpecName "kube-api-access-7kvsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.118272 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699e1980-2c0b-4f99-8977-3ce25d99f142-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "699e1980-2c0b-4f99-8977-3ce25d99f142" (UID: "699e1980-2c0b-4f99-8977-3ce25d99f142"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.176320 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/699e1980-2c0b-4f99-8977-3ce25d99f142-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.176353 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/699e1980-2c0b-4f99-8977-3ce25d99f142-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.176362 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kvsb\" (UniqueName: \"kubernetes.io/projected/699e1980-2c0b-4f99-8977-3ce25d99f142-kube-api-access-7kvsb\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.194987 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.206904 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fppfn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.208530 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbhn"] Jan 30 18:31:12 crc kubenswrapper[4782]: E0130 18:31:12.208801 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699e1980-2c0b-4f99-8977-3ce25d99f142" containerName="collect-profiles" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.208837 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="699e1980-2c0b-4f99-8977-3ce25d99f142" containerName="collect-profiles" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.208943 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="699e1980-2c0b-4f99-8977-3ce25d99f142" containerName="collect-profiles" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.209881 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.212027 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5g8vk" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.212674 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.221333 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbhn"] Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.221693 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fwm2h" Jan 30 18:31:12 crc kubenswrapper[4782]: E0130 18:31:12.240350 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:12 crc kubenswrapper[4782]: E0130 18:31:12.245667 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:12 crc kubenswrapper[4782]: E0130 18:31:12.249420 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:12 crc kubenswrapper[4782]: E0130 18:31:12.249507 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.318103 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.380457 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-catalog-content\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.381036 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-utilities\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.381104 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmgng\" (UniqueName: \"kubernetes.io/projected/913d2663-2aea-4ac0-98bc-eb817aee0f98-kube-api-access-xmgng\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.419574 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.482965 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-utilities\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.483081 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmgng\" (UniqueName: \"kubernetes.io/projected/913d2663-2aea-4ac0-98bc-eb817aee0f98-kube-api-access-xmgng\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.483136 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-catalog-content\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.484844 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-catalog-content\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.485164 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-utilities\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.487768 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.517216 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmgng\" (UniqueName: \"kubernetes.io/projected/913d2663-2aea-4ac0-98bc-eb817aee0f98-kube-api-access-xmgng\") pod \"redhat-marketplace-bnbhn\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.548381 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.570744 4782 generic.go:334] "Generic (PLEG): container finished" podID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerID="1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d" exitCode=0 Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.570883 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94k6x" event={"ID":"76621964-5fa4-4a75-b7d2-9f148a3c701f","Type":"ContainerDied","Data":"1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d"} Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.582451 4782 generic.go:334] "Generic (PLEG): container finished" podID="30838f0d-efae-4fdf-b098-14537a312bb3" containerID="f2d05d5832a56495d7c69b403ab23770f6ea2b72996fdc0eaa85875afe77a1be" exitCode=0 Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.582544 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bldf" event={"ID":"30838f0d-efae-4fdf-b098-14537a312bb3","Type":"ContainerDied","Data":"f2d05d5832a56495d7c69b403ab23770f6ea2b72996fdc0eaa85875afe77a1be"} Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.582597 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bldf" event={"ID":"30838f0d-efae-4fdf-b098-14537a312bb3","Type":"ContainerStarted","Data":"391d73754a6681003005b71b363f0a7f14ce9ef503c7c8434fe6af1c979f7ec8"} Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.591651 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1","Type":"ContainerStarted","Data":"36b0d6b3ec62d8434715ee0e6187c8425896a991aa8b10d3eb0d5d81f8d5c46c"} Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.600182 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" event={"ID":"34770880-dc82-40ff-9989-bbe06f230233","Type":"ContainerStarted","Data":"c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d"} Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.600494 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.618100 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vg9pw"] Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.618198 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.618180233 podStartE2EDuration="2.618180233s" podCreationTimestamp="2026-01-30 18:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:12.615344604 +0000 UTC m=+48.883722639" watchObservedRunningTime="2026-01-30 18:31:12.618180233 +0000 UTC m=+48.886558268" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.619687 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.628117 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg9pw"] Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.632498 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" event={"ID":"699e1980-2c0b-4f99-8977-3ce25d99f142","Type":"ContainerDied","Data":"cd338ffc20a24302558dbae43fbf0e66bbfeed610c616117a073082e237573b0"} Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.632542 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd338ffc20a24302558dbae43fbf0e66bbfeed610c616117a073082e237573b0" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.632562 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.635054 4782 generic.go:334] "Generic (PLEG): container finished" podID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerID="3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80" exitCode=0 Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.635165 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg5cd" event={"ID":"0e13d1a3-9ea0-470c-8e34-c935718e7fcf","Type":"ContainerDied","Data":"3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80"} Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.666921 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" podStartSLOduration=27.666904705 podStartE2EDuration="27.666904705s" podCreationTimestamp="2026-01-30 18:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:12.664892917 +0000 UTC m=+48.933270942" watchObservedRunningTime="2026-01-30 18:31:12.666904705 +0000 UTC m=+48.935282730" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.787480 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhkz\" (UniqueName: \"kubernetes.io/projected/c8ded13e-75b7-4dc5-9b72-631f07ab52da-kube-api-access-7xhkz\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.788014 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-catalog-content\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.788251 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-utilities\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.812288 4782 patch_prober.go:28] interesting 
pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:12 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:12 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:12 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.812364 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.834971 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbhn"] Jan 30 18:31:12 crc kubenswrapper[4782]: W0130 18:31:12.851929 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913d2663_2aea_4ac0_98bc_eb817aee0f98.slice/crio-1367e56f51236d78f50014cbe79e12ab7077ff234a6c63cbe9c92d6d1fe85d17 WatchSource:0}: Error finding container 1367e56f51236d78f50014cbe79e12ab7077ff234a6c63cbe9c92d6d1fe85d17: Status 404 returned error can't find the container with id 1367e56f51236d78f50014cbe79e12ab7077ff234a6c63cbe9c92d6d1fe85d17 Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.885436 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.886469 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.892147 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.893398 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-utilities\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.893464 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhkz\" (UniqueName: \"kubernetes.io/projected/c8ded13e-75b7-4dc5-9b72-631f07ab52da-kube-api-access-7xhkz\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.893507 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-catalog-content\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.894068 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-catalog-content\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" 
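
Two message patterns dominate this stretch of the journal: prober.go "Probe failed" records (the router-default startup probe and the downloads-7954f5f757 readiness/liveness probes) and operation_generator.go "MountVolume.SetUp succeeded" records for the marketplace catalog pods' volumes. The sketch below is one minimal way to tally both from a saved copy of this journal output; the script name, the default input file name, and the regular expressions are assumptions based only on the message text visible in this capture, not on any stable kubelet log format.

```python
#!/usr/bin/env python3
"""summarize_kubelet.py -- minimal sketch, not a supported tool.

Assumes the journal text above has been saved to a plain-text file
(the default name "kubelet.log" is illustrative) and that the klog
message wording matches what appears in this capture.
"""
import re
import sys
from collections import Counter

# Patterns taken from entries in this capture:
#   prober.go:107] "Probe failed" probeType="Startup" pod="ns/name" ...
#   operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"name\" ...
PROBE_FAILED = re.compile(r'"Probe failed" probeType="([^"]+)" pod="([^"]+)"')
MOUNT_OK = re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\"')


def main(path: str) -> None:
    probe_failures = Counter()  # (pod, probe type) -> count
    mounts = Counter()          # volume name -> count
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            # finditer/findall cope with captures like this one, where
            # several journal entries can land on a single text line.
            for m in PROBE_FAILED.finditer(line):
                probe_type, pod = m.group(1), m.group(2)
                probe_failures[(pod, probe_type)] += 1
            for volume in MOUNT_OK.findall(line):
                mounts[volume] += 1

    print("Probe failures by pod and probe type:")
    for (pod, probe_type), count in probe_failures.most_common():
        print(f"  {count:3d}  {probe_type:<9s} {pod}")

    print("\nSuccessful volume mounts by volume name:")
    for volume, count in mounts.most_common():
        print(f"  {count:3d}  {volume}")


if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log")
```

Run against a capture of this unit's journal (for example, journalctl -u kubelet redirected to a file, where the node is accessible), it prints one count per pod/probe pair, which makes the repeated router-default-5444994796-k7vd9 startup-probe failures in the surrounding entries easy to spot.
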
Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.894502 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-utilities\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.894722 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.898441 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.927939 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhkz\" (UniqueName: \"kubernetes.io/projected/c8ded13e-75b7-4dc5-9b72-631f07ab52da-kube-api-access-7xhkz\") pod \"redhat-marketplace-vg9pw\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.989581 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.995497 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:12 crc kubenswrapper[4782]: I0130 18:31:12.995553 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.096883 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.097067 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.098492 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.124272 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.202819 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg9pw"] Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.223592 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:13 crc kubenswrapper[4782]: W0130 18:31:13.225416 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ded13e_75b7_4dc5_9b72_631f07ab52da.slice/crio-4a5badd8a31475a3e2f56bc49daf0ee0568a3b69e6ef58b18b082bbd0d67f7eb WatchSource:0}: Error finding container 4a5badd8a31475a3e2f56bc49daf0ee0568a3b69e6ef58b18b082bbd0d67f7eb: Status 404 returned error can't find the container with id 4a5badd8a31475a3e2f56bc49daf0ee0568a3b69e6ef58b18b082bbd0d67f7eb Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.409275 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l4jgx"] Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.410949 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.414222 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.420904 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4jgx"] Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.607431 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-catalog-content\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.608374 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-utilities\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.608464 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6bhm\" (UniqueName: \"kubernetes.io/projected/2eeba928-9384-4789-b6d2-dbc557b815d5-kube-api-access-h6bhm\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.645746 4782 generic.go:334] "Generic (PLEG): container finished" podID="2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1" containerID="36b0d6b3ec62d8434715ee0e6187c8425896a991aa8b10d3eb0d5d81f8d5c46c" exitCode=0 Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.645831 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1","Type":"ContainerDied","Data":"36b0d6b3ec62d8434715ee0e6187c8425896a991aa8b10d3eb0d5d81f8d5c46c"} Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.648454 4782 generic.go:334] "Generic (PLEG): container finished" podID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerID="bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2" exitCode=0 Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.648527 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbhn" event={"ID":"913d2663-2aea-4ac0-98bc-eb817aee0f98","Type":"ContainerDied","Data":"bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2"} Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.648640 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbhn" event={"ID":"913d2663-2aea-4ac0-98bc-eb817aee0f98","Type":"ContainerStarted","Data":"1367e56f51236d78f50014cbe79e12ab7077ff234a6c63cbe9c92d6d1fe85d17"} Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.650736 4782 generic.go:334] "Generic (PLEG): container finished" podID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerID="f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415" exitCode=0 Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.650801 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg9pw" event={"ID":"c8ded13e-75b7-4dc5-9b72-631f07ab52da","Type":"ContainerDied","Data":"f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415"} Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.650830 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg9pw" event={"ID":"c8ded13e-75b7-4dc5-9b72-631f07ab52da","Type":"ContainerStarted","Data":"4a5badd8a31475a3e2f56bc49daf0ee0568a3b69e6ef58b18b082bbd0d67f7eb"} Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.710280 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6bhm\" (UniqueName: \"kubernetes.io/projected/2eeba928-9384-4789-b6d2-dbc557b815d5-kube-api-access-h6bhm\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.710789 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-catalog-content\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.710879 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-utilities\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.711426 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-catalog-content\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 
18:31:13.711453 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-utilities\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.730199 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.733959 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6bhm\" (UniqueName: \"kubernetes.io/projected/2eeba928-9384-4789-b6d2-dbc557b815d5-kube-api-access-h6bhm\") pod \"redhat-operators-l4jgx\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.807766 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jngb2"] Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.809426 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.811328 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:13 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:13 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:13 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.811378 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.836310 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jngb2"] Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.913570 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-catalog-content\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.914135 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7qpv\" (UniqueName: \"kubernetes.io/projected/93582a18-a665-4de9-b213-bae40598079d-kube-api-access-m7qpv\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:13 crc kubenswrapper[4782]: I0130 18:31:13.914178 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-utilities\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.016113 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-catalog-content\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.015469 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-catalog-content\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.016299 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7qpv\" (UniqueName: \"kubernetes.io/projected/93582a18-a665-4de9-b213-bae40598079d-kube-api-access-m7qpv\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.018770 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-utilities\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.016831 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-utilities\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.032857 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.036510 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7qpv\" (UniqueName: \"kubernetes.io/projected/93582a18-a665-4de9-b213-bae40598079d-kube-api-access-m7qpv\") pod \"redhat-operators-jngb2\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.135469 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.276131 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4jgx"] Jan 30 18:31:14 crc kubenswrapper[4782]: W0130 18:31:14.317364 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eeba928_9384_4789_b6d2_dbc557b815d5.slice/crio-ea6f3270a846f64f98c9041c3bafb76ae9a174c49e52557d916afb1636721dec WatchSource:0}: Error finding container ea6f3270a846f64f98c9041c3bafb76ae9a174c49e52557d916afb1636721dec: Status 404 returned error can't find the container with id ea6f3270a846f64f98c9041c3bafb76ae9a174c49e52557d916afb1636721dec Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.658745 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154","Type":"ContainerStarted","Data":"41130d1feb70c31cfbcb21e3991697fc300b0ac7661db528445b0e89c9965a5a"} Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.659317 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154","Type":"ContainerStarted","Data":"a9ba9976f13e46c857e53b436998e17d866c25631b3eedbf86541423c00b725b"} Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.661640 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4jgx" event={"ID":"2eeba928-9384-4789-b6d2-dbc557b815d5","Type":"ContainerStarted","Data":"ea6f3270a846f64f98c9041c3bafb76ae9a174c49e52557d916afb1636721dec"} Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.755590 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jngb2"] Jan 30 18:31:14 crc kubenswrapper[4782]: W0130 18:31:14.792509 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93582a18_a665_4de9_b213_bae40598079d.slice/crio-22a68c0e6eec6f2800c0b3ecf62ed44fc561c88638f8eafa589acea675859e8c WatchSource:0}: Error finding container 22a68c0e6eec6f2800c0b3ecf62ed44fc561c88638f8eafa589acea675859e8c: Status 404 returned error can't find the container with id 22a68c0e6eec6f2800c0b3ecf62ed44fc561c88638f8eafa589acea675859e8c Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.810567 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:14 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:14 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:14 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.810677 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:14 crc kubenswrapper[4782]: I0130 18:31:14.952996 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.039115 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kube-api-access\") pod \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.039250 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kubelet-dir\") pod \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\" (UID: \"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1\") " Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.039391 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1" (UID: "2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.040362 4782 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.046985 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1" (UID: "2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.125539 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.142302 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.153856 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.673766 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1","Type":"ContainerDied","Data":"ec3713628237d3c0b1c8da319d4257ca74253ebcacfdbffe4f282b9410c112a8"} Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.674126 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec3713628237d3c0b1c8da319d4257ca74253ebcacfdbffe4f282b9410c112a8" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.673833 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.676351 4782 generic.go:334] "Generic (PLEG): container finished" podID="93582a18-a665-4de9-b213-bae40598079d" containerID="0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d" exitCode=0 Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.676435 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jngb2" event={"ID":"93582a18-a665-4de9-b213-bae40598079d","Type":"ContainerDied","Data":"0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d"} Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.676509 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jngb2" event={"ID":"93582a18-a665-4de9-b213-bae40598079d","Type":"ContainerStarted","Data":"22a68c0e6eec6f2800c0b3ecf62ed44fc561c88638f8eafa589acea675859e8c"} Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.680249 4782 generic.go:334] "Generic (PLEG): container finished" podID="7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154" containerID="41130d1feb70c31cfbcb21e3991697fc300b0ac7661db528445b0e89c9965a5a" exitCode=0 Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.680319 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154","Type":"ContainerDied","Data":"41130d1feb70c31cfbcb21e3991697fc300b0ac7661db528445b0e89c9965a5a"} Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.682838 4782 generic.go:334] "Generic (PLEG): container finished" podID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerID="09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529" exitCode=0 Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.682907 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4jgx" event={"ID":"2eeba928-9384-4789-b6d2-dbc557b815d5","Type":"ContainerDied","Data":"09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529"} Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.699090 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.699068953 podStartE2EDuration="699.068953ms" podCreationTimestamp="2026-01-30 18:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:15.698504699 +0000 UTC m=+51.966882724" watchObservedRunningTime="2026-01-30 18:31:15.699068953 +0000 UTC m=+51.967446978" Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.810595 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:15 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:15 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:15 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:15 crc kubenswrapper[4782]: I0130 18:31:15.810671 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 
18:31:16 crc kubenswrapper[4782]: I0130 18:31:16.004553 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:31:16 crc kubenswrapper[4782]: I0130 18:31:16.009960 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fq5gm" Jan 30 18:31:16 crc kubenswrapper[4782]: I0130 18:31:16.825492 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:16 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:16 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:16 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:16 crc kubenswrapper[4782]: I0130 18:31:16.825573 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:17 crc kubenswrapper[4782]: I0130 18:31:17.264678 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9w2tq" Jan 30 18:31:17 crc kubenswrapper[4782]: I0130 18:31:17.810861 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:17 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:17 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:17 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:17 crc kubenswrapper[4782]: I0130 18:31:17.810934 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:17 crc kubenswrapper[4782]: I0130 18:31:17.907309 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:31:17 crc kubenswrapper[4782]: I0130 18:31:17.907561 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:31:17 crc kubenswrapper[4782]: I0130 18:31:17.961457 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.603618 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.712130 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154","Type":"ContainerDied","Data":"a9ba9976f13e46c857e53b436998e17d866c25631b3eedbf86541423c00b725b"} Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.712199 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9ba9976f13e46c857e53b436998e17d866c25631b3eedbf86541423c00b725b" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.712155 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.727908 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kubelet-dir\") pod \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.727992 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154" (UID: "7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.728032 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kube-api-access\") pod \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\" (UID: \"7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154\") " Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.729062 4782 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.739084 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154" (UID: "7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.810355 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:18 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:18 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:18 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.810437 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:18 crc kubenswrapper[4782]: I0130 18:31:18.831072 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:19 crc kubenswrapper[4782]: I0130 18:31:19.809718 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:19 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:19 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:19 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:19 crc kubenswrapper[4782]: I0130 18:31:19.809797 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:20 crc kubenswrapper[4782]: I0130 18:31:20.810122 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:20 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:20 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:20 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:20 crc kubenswrapper[4782]: I0130 18:31:20.810807 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:20 crc kubenswrapper[4782]: I0130 18:31:20.964729 4782 patch_prober.go:28] interesting pod/console-f9d7485db-hcttm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 30 18:31:20 crc kubenswrapper[4782]: I0130 18:31:20.964808 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hcttm" podUID="22efd41f-5357-4820-afa4-09733ef60db0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 30 18:31:21 crc kubenswrapper[4782]: I0130 
18:31:21.128981 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4dqmv" Jan 30 18:31:21 crc kubenswrapper[4782]: I0130 18:31:21.809781 4782 patch_prober.go:28] interesting pod/router-default-5444994796-k7vd9 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 18:31:21 crc kubenswrapper[4782]: [-]has-synced failed: reason withheld Jan 30 18:31:21 crc kubenswrapper[4782]: [+]process-running ok Jan 30 18:31:21 crc kubenswrapper[4782]: healthz check failed Jan 30 18:31:21 crc kubenswrapper[4782]: I0130 18:31:21.809885 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7vd9" podUID="7e6c4d92-bd0b-48c3-92b8-d8bb236ee8aa" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 18:31:22 crc kubenswrapper[4782]: E0130 18:31:22.237595 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:22 crc kubenswrapper[4782]: E0130 18:31:22.239960 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:22 crc kubenswrapper[4782]: E0130 18:31:22.241628 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:22 crc kubenswrapper[4782]: E0130 18:31:22.241735 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" Jan 30 18:31:22 crc kubenswrapper[4782]: I0130 18:31:22.810957 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:22 crc kubenswrapper[4782]: I0130 18:31:22.813067 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-k7vd9" Jan 30 18:31:30 crc kubenswrapper[4782]: I0130 18:31:30.967749 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:31:30 crc kubenswrapper[4782]: I0130 18:31:30.975848 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:31:31 crc kubenswrapper[4782]: I0130 18:31:31.152043 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:31:32 crc kubenswrapper[4782]: E0130 
18:31:32.236288 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:32 crc kubenswrapper[4782]: E0130 18:31:32.238693 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:32 crc kubenswrapper[4782]: E0130 18:31:32.239810 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:32 crc kubenswrapper[4782]: E0130 18:31:32.239860 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" Jan 30 18:31:34 crc kubenswrapper[4782]: I0130 18:31:34.428676 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 30 18:31:34 crc kubenswrapper[4782]: E0130 18:31:34.914836 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 30 18:31:34 crc kubenswrapper[4782]: E0130 18:31:34.915388 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9zwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qg5cd_openshift-marketplace(0e13d1a3-9ea0-470c-8e34-c935718e7fcf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 18:31:34 crc kubenswrapper[4782]: E0130 18:31:34.917268 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qg5cd" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" Jan 30 18:31:35 crc kubenswrapper[4782]: I0130 18:31:35.852168 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.8521098999999999 podStartE2EDuration="1.8521099s" podCreationTimestamp="2026-01-30 18:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:35.848439149 +0000 UTC m=+72.116817174" watchObservedRunningTime="2026-01-30 18:31:35.8521099 +0000 UTC m=+72.120487925" Jan 30 18:31:39 crc kubenswrapper[4782]: E0130 18:31:39.643729 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qg5cd" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" Jan 30 18:31:39 crc kubenswrapper[4782]: I0130 18:31:39.836123 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5xjfh_343fced1-cf4b-4d48-9a25-df17b608e09e/kube-multus-additional-cni-plugins/0.log" Jan 30 18:31:39 crc kubenswrapper[4782]: I0130 18:31:39.836200 4782 generic.go:334] "Generic (PLEG): container finished" podID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" exitCode=137 Jan 30 18:31:39 crc kubenswrapper[4782]: I0130 18:31:39.836277 4782 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" event={"ID":"343fced1-cf4b-4d48-9a25-df17b608e09e","Type":"ContainerDied","Data":"54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62"} Jan 30 18:31:41 crc kubenswrapper[4782]: I0130 18:31:41.727334 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 18:31:41 crc kubenswrapper[4782]: E0130 18:31:41.740883 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 18:31:41 crc kubenswrapper[4782]: E0130 18:31:41.741137 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xhkz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vg9pw_openshift-marketplace(c8ded13e-75b7-4dc5-9b72-631f07ab52da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 18:31:41 crc kubenswrapper[4782]: E0130 18:31:41.742449 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vg9pw" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" Jan 30 18:31:42 crc kubenswrapper[4782]: E0130 18:31:42.234825 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62 is running failed: container process not found" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:42 crc 
kubenswrapper[4782]: E0130 18:31:42.235524 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62 is running failed: container process not found" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:42 crc kubenswrapper[4782]: E0130 18:31:42.235964 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62 is running failed: container process not found" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 30 18:31:42 crc kubenswrapper[4782]: E0130 18:31:42.236005 4782 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" Jan 30 18:31:42 crc kubenswrapper[4782]: I0130 18:31:42.327968 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mdlt5" Jan 30 18:31:42 crc kubenswrapper[4782]: E0130 18:31:42.622087 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 18:31:42 crc kubenswrapper[4782]: E0130 18:31:42.622328 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmgng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} 
start failed in pod redhat-marketplace-bnbhn_openshift-marketplace(913d2663-2aea-4ac0-98bc-eb817aee0f98): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 18:31:42 crc kubenswrapper[4782]: E0130 18:31:42.623473 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bnbhn" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" Jan 30 18:31:42 crc kubenswrapper[4782]: I0130 18:31:42.635618 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xmfnp"] Jan 30 18:31:42 crc kubenswrapper[4782]: I0130 18:31:42.635870 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" podUID="6e4804cf-00c0-4598-9254-c5c424b013c2" containerName="controller-manager" containerID="cri-o://08d773ca39566ae2c56c91e635802de7c7ddc86cd75bda0d6cb6fc8d4b81df12" gracePeriod=30 Jan 30 18:31:42 crc kubenswrapper[4782]: I0130 18:31:42.740718 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm"] Jan 30 18:31:42 crc kubenswrapper[4782]: I0130 18:31:42.741035 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" podUID="f0e4561f-3acb-40aa-86fe-9fb86a840e31" containerName="route-controller-manager" containerID="cri-o://6a7a4c929ffc1bae1ab30059cf19a8b50e583907785398bf1fd12072d8fbce73" gracePeriod=30 Jan 30 18:31:43 crc kubenswrapper[4782]: I0130 18:31:43.445096 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 30 18:31:43 crc kubenswrapper[4782]: E0130 18:31:43.996900 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 30 18:31:44 crc kubenswrapper[4782]: E0130 18:31:43.997162 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxb47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-94k6x_openshift-marketplace(76621964-5fa4-4a75-b7d2-9f148a3c701f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 18:31:44 crc kubenswrapper[4782]: E0130 18:31:43.998533 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-94k6x" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" Jan 30 18:31:44 crc kubenswrapper[4782]: I0130 18:31:44.870904 4782 generic.go:334] "Generic (PLEG): container finished" podID="f0e4561f-3acb-40aa-86fe-9fb86a840e31" containerID="6a7a4c929ffc1bae1ab30059cf19a8b50e583907785398bf1fd12072d8fbce73" exitCode=0 Jan 30 18:31:44 crc kubenswrapper[4782]: I0130 18:31:44.871000 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" event={"ID":"f0e4561f-3acb-40aa-86fe-9fb86a840e31","Type":"ContainerDied","Data":"6a7a4c929ffc1bae1ab30059cf19a8b50e583907785398bf1fd12072d8fbce73"} Jan 30 18:31:44 crc kubenswrapper[4782]: I0130 18:31:44.873919 4782 generic.go:334] "Generic (PLEG): container finished" podID="6e4804cf-00c0-4598-9254-c5c424b013c2" containerID="08d773ca39566ae2c56c91e635802de7c7ddc86cd75bda0d6cb6fc8d4b81df12" exitCode=0 Jan 30 18:31:44 crc kubenswrapper[4782]: I0130 18:31:44.873964 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" event={"ID":"6e4804cf-00c0-4598-9254-c5c424b013c2","Type":"ContainerDied","Data":"08d773ca39566ae2c56c91e635802de7c7ddc86cd75bda0d6cb6fc8d4b81df12"} Jan 30 18:31:44 crc kubenswrapper[4782]: I0130 18:31:44.893566 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.892692925 podStartE2EDuration="1.892692925s" podCreationTimestamp="2026-01-30 18:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-30 18:31:44.444426655 +0000 UTC m=+80.712804720" watchObservedRunningTime="2026-01-30 18:31:44.892692925 +0000 UTC m=+81.161070950" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.404598 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bnbhn" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.420464 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-94k6x" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.440739 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.441224 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6bhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-l4jgx_openshift-marketplace(2eeba928-9384-4789-b6d2-dbc557b815d5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.442642 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-l4jgx" 
podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.447303 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5xjfh_343fced1-cf4b-4d48-9a25-df17b608e09e/kube-multus-additional-cni-plugins/0.log" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.447418 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.454531 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.454762 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m7qpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jngb2_openshift-marketplace(93582a18-a665-4de9-b213-bae40598079d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.455973 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jngb2" podUID="93582a18-a665-4de9-b213-bae40598079d" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.609708 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8jwr\" (UniqueName: \"kubernetes.io/projected/343fced1-cf4b-4d48-9a25-df17b608e09e-kube-api-access-p8jwr\") pod \"343fced1-cf4b-4d48-9a25-df17b608e09e\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.609757 4782 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/343fced1-cf4b-4d48-9a25-df17b608e09e-tuning-conf-dir\") pod \"343fced1-cf4b-4d48-9a25-df17b608e09e\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.609887 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/343fced1-cf4b-4d48-9a25-df17b608e09e-ready\") pod \"343fced1-cf4b-4d48-9a25-df17b608e09e\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.609932 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/343fced1-cf4b-4d48-9a25-df17b608e09e-cni-sysctl-allowlist\") pod \"343fced1-cf4b-4d48-9a25-df17b608e09e\" (UID: \"343fced1-cf4b-4d48-9a25-df17b608e09e\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.610917 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343fced1-cf4b-4d48-9a25-df17b608e09e-ready" (OuterVolumeSpecName: "ready") pod "343fced1-cf4b-4d48-9a25-df17b608e09e" (UID: "343fced1-cf4b-4d48-9a25-df17b608e09e"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.610991 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/343fced1-cf4b-4d48-9a25-df17b608e09e-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "343fced1-cf4b-4d48-9a25-df17b608e09e" (UID: "343fced1-cf4b-4d48-9a25-df17b608e09e"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.610011 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/343fced1-cf4b-4d48-9a25-df17b608e09e-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "343fced1-cf4b-4d48-9a25-df17b608e09e" (UID: "343fced1-cf4b-4d48-9a25-df17b608e09e"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.618626 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343fced1-cf4b-4d48-9a25-df17b608e09e-kube-api-access-p8jwr" (OuterVolumeSpecName: "kube-api-access-p8jwr") pod "343fced1-cf4b-4d48-9a25-df17b608e09e" (UID: "343fced1-cf4b-4d48-9a25-df17b608e09e"). InnerVolumeSpecName "kube-api-access-p8jwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.712355 4782 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/343fced1-cf4b-4d48-9a25-df17b608e09e-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.712941 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8jwr\" (UniqueName: \"kubernetes.io/projected/343fced1-cf4b-4d48-9a25-df17b608e09e-kube-api-access-p8jwr\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.712955 4782 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/343fced1-cf4b-4d48-9a25-df17b608e09e-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.712972 4782 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/343fced1-cf4b-4d48-9a25-df17b608e09e-ready\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.827770 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.892250 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" event={"ID":"6e4804cf-00c0-4598-9254-c5c424b013c2","Type":"ContainerDied","Data":"556ff1980a2954b5b1f7261b1c6750bab42af385dd0e11551a974912148cb9b4"} Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.892668 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556ff1980a2954b5b1f7261b1c6750bab42af385dd0e11551a974912148cb9b4" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.898710 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-5xjfh_343fced1-cf4b-4d48-9a25-df17b608e09e/kube-multus-additional-cni-plugins/0.log" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.899129 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" event={"ID":"343fced1-cf4b-4d48-9a25-df17b608e09e","Type":"ContainerDied","Data":"a73fd03297bd2cbef752c9abfa5674cc212696cfc51a74bc2aa7bbc6a924786b"} Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.899414 4782 scope.go:117] "RemoveContainer" containerID="54e502288a9316f8870950f2f6a1e4b232ee89ef93773572cde15f9e0c127e62" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.899863 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-5xjfh" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.904986 4782 generic.go:334] "Generic (PLEG): container finished" podID="30838f0d-efae-4fdf-b098-14537a312bb3" containerID="d4a7859cf4883277176a235d3def55c7a037b8b0d5fc1646bb4b4b2a3e1e18a1" exitCode=0 Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.905048 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bldf" event={"ID":"30838f0d-efae-4fdf-b098-14537a312bb3","Type":"ContainerDied","Data":"d4a7859cf4883277176a235d3def55c7a037b8b0d5fc1646bb4b4b2a3e1e18a1"} Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.908671 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" event={"ID":"f0e4561f-3acb-40aa-86fe-9fb86a840e31","Type":"ContainerDied","Data":"8ab7dc58c6394ffc1b0fb910dc0ca63b431ec77b10225f0d25ebee6263f5089e"} Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.908802 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.912817 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddhqf" event={"ID":"b84c2e64-bb1c-40ea-a369-55cce87dc7d7","Type":"ContainerStarted","Data":"32e8b499e61c0a107c3a942378bdfbbbdb9d422b3e24b3475724de876888ee99"} Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.913615 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jngb2" podUID="93582a18-a665-4de9-b213-bae40598079d" Jan 30 18:31:47 crc kubenswrapper[4782]: E0130 18:31:47.914811 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-l4jgx" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.916070 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-client-ca\") pod \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.916198 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-config\") pod \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.916259 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/f0e4561f-3acb-40aa-86fe-9fb86a840e31-kube-api-access-gnt8b\") pod \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.916289 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f0e4561f-3acb-40aa-86fe-9fb86a840e31-serving-cert\") pod \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\" (UID: \"f0e4561f-3acb-40aa-86fe-9fb86a840e31\") " Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.917479 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-config" (OuterVolumeSpecName: "config") pod "f0e4561f-3acb-40aa-86fe-9fb86a840e31" (UID: "f0e4561f-3acb-40aa-86fe-9fb86a840e31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.919058 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-client-ca" (OuterVolumeSpecName: "client-ca") pod "f0e4561f-3acb-40aa-86fe-9fb86a840e31" (UID: "f0e4561f-3acb-40aa-86fe-9fb86a840e31"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.923774 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0e4561f-3acb-40aa-86fe-9fb86a840e31-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f0e4561f-3acb-40aa-86fe-9fb86a840e31" (UID: "f0e4561f-3acb-40aa-86fe-9fb86a840e31"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.924725 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e4561f-3acb-40aa-86fe-9fb86a840e31-kube-api-access-gnt8b" (OuterVolumeSpecName: "kube-api-access-gnt8b") pod "f0e4561f-3acb-40aa-86fe-9fb86a840e31" (UID: "f0e4561f-3acb-40aa-86fe-9fb86a840e31"). InnerVolumeSpecName "kube-api-access-gnt8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.978707 4782 scope.go:117] "RemoveContainer" containerID="6a7a4c929ffc1bae1ab30059cf19a8b50e583907785398bf1fd12072d8fbce73" Jan 30 18:31:47 crc kubenswrapper[4782]: I0130 18:31:47.981102 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.018441 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.018923 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0e4561f-3acb-40aa-86fe-9fb86a840e31-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.018939 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0e4561f-3acb-40aa-86fe-9fb86a840e31-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.018951 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnt8b\" (UniqueName: \"kubernetes.io/projected/f0e4561f-3acb-40aa-86fe-9fb86a840e31-kube-api-access-gnt8b\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.030524 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5xjfh"] Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.035640 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-5xjfh"] Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.119788 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-client-ca\") pod \"6e4804cf-00c0-4598-9254-c5c424b013c2\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.119941 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-proxy-ca-bundles\") pod \"6e4804cf-00c0-4598-9254-c5c424b013c2\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.119973 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e4804cf-00c0-4598-9254-c5c424b013c2-serving-cert\") pod \"6e4804cf-00c0-4598-9254-c5c424b013c2\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.120043 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqh7m\" (UniqueName: \"kubernetes.io/projected/6e4804cf-00c0-4598-9254-c5c424b013c2-kube-api-access-wqh7m\") pod \"6e4804cf-00c0-4598-9254-c5c424b013c2\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.120068 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-config\") pod \"6e4804cf-00c0-4598-9254-c5c424b013c2\" (UID: \"6e4804cf-00c0-4598-9254-c5c424b013c2\") " Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.121552 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6e4804cf-00c0-4598-9254-c5c424b013c2" 
(UID: "6e4804cf-00c0-4598-9254-c5c424b013c2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.122129 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "6e4804cf-00c0-4598-9254-c5c424b013c2" (UID: "6e4804cf-00c0-4598-9254-c5c424b013c2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.122675 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-config" (OuterVolumeSpecName: "config") pod "6e4804cf-00c0-4598-9254-c5c424b013c2" (UID: "6e4804cf-00c0-4598-9254-c5c424b013c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.125724 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4804cf-00c0-4598-9254-c5c424b013c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6e4804cf-00c0-4598-9254-c5c424b013c2" (UID: "6e4804cf-00c0-4598-9254-c5c424b013c2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.125750 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4804cf-00c0-4598-9254-c5c424b013c2-kube-api-access-wqh7m" (OuterVolumeSpecName: "kube-api-access-wqh7m") pod "6e4804cf-00c0-4598-9254-c5c424b013c2" (UID: "6e4804cf-00c0-4598-9254-c5c424b013c2"). InnerVolumeSpecName "kube-api-access-wqh7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.221118 4782 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.221142 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e4804cf-00c0-4598-9254-c5c424b013c2-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.221154 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqh7m\" (UniqueName: \"kubernetes.io/projected/6e4804cf-00c0-4598-9254-c5c424b013c2-kube-api-access-wqh7m\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.221168 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.221179 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e4804cf-00c0-4598-9254-c5c424b013c2-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.241058 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm"] Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.243829 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-x4wsm"] Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.422201 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" path="/var/lib/kubelet/pods/343fced1-cf4b-4d48-9a25-df17b608e09e/volumes" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.423451 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e4561f-3acb-40aa-86fe-9fb86a840e31" path="/var/lib/kubelet/pods/f0e4561f-3acb-40aa-86fe-9fb86a840e31/volumes" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.920633 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bldf" event={"ID":"30838f0d-efae-4fdf-b098-14537a312bb3","Type":"ContainerStarted","Data":"bcf6a913807a3b71609f18922b01a8ade8300f55a4bbf40ca42fbdbb87d4fed1"} Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.924880 4782 generic.go:334] "Generic (PLEG): container finished" podID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerID="32e8b499e61c0a107c3a942378bdfbbbdb9d422b3e24b3475724de876888ee99" exitCode=0 Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.924957 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddhqf" event={"ID":"b84c2e64-bb1c-40ea-a369-55cce87dc7d7","Type":"ContainerDied","Data":"32e8b499e61c0a107c3a942378bdfbbbdb9d422b3e24b3475724de876888ee99"} Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.925001 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddhqf" event={"ID":"b84c2e64-bb1c-40ea-a369-55cce87dc7d7","Type":"ContainerStarted","Data":"6b501a8d7f132473fd6c980dd9738f0d34b2d61b6a6a96dae6cb18cd28ae981a"} Jan 30 18:31:48 crc kubenswrapper[4782]: 
I0130 18:31:48.926472 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xmfnp" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.945831 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5bldf" podStartSLOduration=3.243421926 podStartE2EDuration="38.945811739s" podCreationTimestamp="2026-01-30 18:31:10 +0000 UTC" firstStartedPulling="2026-01-30 18:31:12.596069871 +0000 UTC m=+48.864447896" lastFinishedPulling="2026-01-30 18:31:48.298459684 +0000 UTC m=+84.566837709" observedRunningTime="2026-01-30 18:31:48.945302396 +0000 UTC m=+85.213680431" watchObservedRunningTime="2026-01-30 18:31:48.945811739 +0000 UTC m=+85.214189774" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.967798 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ddhqf" podStartSLOduration=2.145768403 podStartE2EDuration="38.967777013s" podCreationTimestamp="2026-01-30 18:31:10 +0000 UTC" firstStartedPulling="2026-01-30 18:31:11.527481279 +0000 UTC m=+47.795859304" lastFinishedPulling="2026-01-30 18:31:48.349489869 +0000 UTC m=+84.617867914" observedRunningTime="2026-01-30 18:31:48.967106256 +0000 UTC m=+85.235484321" watchObservedRunningTime="2026-01-30 18:31:48.967777013 +0000 UTC m=+85.236155038" Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.981024 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xmfnp"] Jan 30 18:31:48 crc kubenswrapper[4782]: I0130 18:31:48.991734 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xmfnp"] Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.287602 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 18:31:49 crc kubenswrapper[4782]: E0130 18:31:49.287813 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e4561f-3acb-40aa-86fe-9fb86a840e31" containerName="route-controller-manager" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.287828 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e4561f-3acb-40aa-86fe-9fb86a840e31" containerName="route-controller-manager" Jan 30 18:31:49 crc kubenswrapper[4782]: E0130 18:31:49.287843 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154" containerName="pruner" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.287849 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154" containerName="pruner" Jan 30 18:31:49 crc kubenswrapper[4782]: E0130 18:31:49.287859 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.287865 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" Jan 30 18:31:49 crc kubenswrapper[4782]: E0130 18:31:49.287876 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4804cf-00c0-4598-9254-c5c424b013c2" containerName="controller-manager" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.287882 4782 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6e4804cf-00c0-4598-9254-c5c424b013c2" containerName="controller-manager" Jan 30 18:31:49 crc kubenswrapper[4782]: E0130 18:31:49.287895 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1" containerName="pruner" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.287901 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1" containerName="pruner" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.288004 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4804cf-00c0-4598-9254-c5c424b013c2" containerName="controller-manager" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.288014 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="343fced1-cf4b-4d48-9a25-df17b608e09e" containerName="kube-multus-additional-cni-plugins" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.288022 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b72bc2e-3bb0-4cf6-905e-2f6989e2b8c1" containerName="pruner" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.288030 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5d9cc7-7fcf-4ed4-b096-ae3f4b68b154" containerName="pruner" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.288039 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e4561f-3acb-40aa-86fe-9fb86a840e31" containerName="route-controller-manager" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.288414 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.292707 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.292866 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.299831 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.438890 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.438987 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.540468 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.541103 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.540660 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.564089 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.607846 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.846917 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 18:31:49 crc kubenswrapper[4782]: I0130 18:31:49.937541 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3a0ce7f8-c7eb-4ead-b166-c909664846ac","Type":"ContainerStarted","Data":"390a1c6950b00979bb9e8415256f09b72110de506539b6c4089c1428d1ae3be9"} Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.366957 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d7795ff57-b25nq"] Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.367803 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.369353 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.369764 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.370138 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.370385 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.370498 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.372569 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.375406 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv"] Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.376164 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.377971 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.378121 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.378264 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.378267 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.378354 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.378507 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.380321 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d7795ff57-b25nq"] Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.385944 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.388588 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv"] Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.418441 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e4804cf-00c0-4598-9254-c5c424b013c2" path="/var/lib/kubelet/pods/6e4804cf-00c0-4598-9254-c5c424b013c2/volumes" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457108 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-proxy-ca-bundles\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457162 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-config\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457189 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th47v\" (UniqueName: \"kubernetes.io/projected/aaa1216f-28e2-485d-affa-f9795b56467d-kube-api-access-th47v\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457211 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1216f-28e2-485d-affa-f9795b56467d-serving-cert\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457274 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dt25\" (UniqueName: \"kubernetes.io/projected/76b68ce0-487b-45b1-8afe-01c527033ddb-kube-api-access-2dt25\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457296 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-client-ca\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457311 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-config\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457326 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-client-ca\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.457344 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76b68ce0-487b-45b1-8afe-01c527033ddb-serving-cert\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.528597 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.528691 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.558987 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dt25\" (UniqueName: \"kubernetes.io/projected/76b68ce0-487b-45b1-8afe-01c527033ddb-kube-api-access-2dt25\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.559044 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-client-ca\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.559065 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-client-ca\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.559080 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-config\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.559130 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76b68ce0-487b-45b1-8afe-01c527033ddb-serving-cert\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.560389 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-client-ca\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.560473 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-client-ca\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.560738 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-proxy-ca-bundles\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.560767 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-config\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.560786 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th47v\" (UniqueName: \"kubernetes.io/projected/aaa1216f-28e2-485d-affa-f9795b56467d-kube-api-access-th47v\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " 
pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.560805 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1216f-28e2-485d-affa-f9795b56467d-serving-cert\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.561434 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-config\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.561846 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-config\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.562516 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-proxy-ca-bundles\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.576124 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76b68ce0-487b-45b1-8afe-01c527033ddb-serving-cert\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.576439 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1216f-28e2-485d-affa-f9795b56467d-serving-cert\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.581072 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th47v\" (UniqueName: \"kubernetes.io/projected/aaa1216f-28e2-485d-affa-f9795b56467d-kube-api-access-th47v\") pod \"controller-manager-6d7795ff57-b25nq\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.586385 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dt25\" (UniqueName: \"kubernetes.io/projected/76b68ce0-487b-45b1-8afe-01c527033ddb-kube-api-access-2dt25\") pod \"route-controller-manager-57d96b979-xljtv\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.686998 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.687148 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.698650 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.917422 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d7795ff57-b25nq"] Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.946134 4782 generic.go:334] "Generic (PLEG): container finished" podID="3a0ce7f8-c7eb-4ead-b166-c909664846ac" containerID="05f4e58430a26c7313b95291e03629a7a1fa8c98de41191b2c4d31d6df00129c" exitCode=0 Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.946252 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3a0ce7f8-c7eb-4ead-b166-c909664846ac","Type":"ContainerDied","Data":"05f4e58430a26c7313b95291e03629a7a1fa8c98de41191b2c4d31d6df00129c"} Jan 30 18:31:50 crc kubenswrapper[4782]: I0130 18:31:50.970651 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv"] Jan 30 18:31:50 crc kubenswrapper[4782]: W0130 18:31:50.977357 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b68ce0_487b_45b1_8afe_01c527033ddb.slice/crio-73152f5339d2b92e055c2792d5279ebcb3042e390335caff7e8af7b8ec51e701 WatchSource:0}: Error finding container 73152f5339d2b92e055c2792d5279ebcb3042e390335caff7e8af7b8ec51e701: Status 404 returned error can't find the container with id 73152f5339d2b92e055c2792d5279ebcb3042e390335caff7e8af7b8ec51e701 Jan 30 18:31:50 crc kubenswrapper[4782]: W0130 18:31:50.977572 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaa1216f_28e2_485d_affa_f9795b56467d.slice/crio-3a9c715c866cff0a59c933de15d1bbc3c95308374d46d8361665c3516e6348eb WatchSource:0}: Error finding container 3a9c715c866cff0a59c933de15d1bbc3c95308374d46d8361665c3516e6348eb: Status 404 returned error can't find the container with id 3a9c715c866cff0a59c933de15d1bbc3c95308374d46d8361665c3516e6348eb Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.197314 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.197371 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.243572 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.953320 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" event={"ID":"aaa1216f-28e2-485d-affa-f9795b56467d","Type":"ContainerStarted","Data":"1def8a0bf6cfa6bcb3191ad4305d2125612c0daff8b44e0c0605a54e68ffb189"} Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.953762 4782 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.953795 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" event={"ID":"aaa1216f-28e2-485d-affa-f9795b56467d","Type":"ContainerStarted","Data":"3a9c715c866cff0a59c933de15d1bbc3c95308374d46d8361665c3516e6348eb"} Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.955044 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" event={"ID":"76b68ce0-487b-45b1-8afe-01c527033ddb","Type":"ContainerStarted","Data":"2ccf842732f23fceefb5e20dfb399005c17bcea11b2be4fb062549cc928be3e1"} Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.955108 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" event={"ID":"76b68ce0-487b-45b1-8afe-01c527033ddb","Type":"ContainerStarted","Data":"73152f5339d2b92e055c2792d5279ebcb3042e390335caff7e8af7b8ec51e701"} Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.962424 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.972914 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" podStartSLOduration=9.972890422999999 podStartE2EDuration="9.972890423s" podCreationTimestamp="2026-01-30 18:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:51.970715349 +0000 UTC m=+88.239093394" watchObservedRunningTime="2026-01-30 18:31:51.972890423 +0000 UTC m=+88.241268448" Jan 30 18:31:51 crc kubenswrapper[4782]: I0130 18:31:51.994664 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" podStartSLOduration=9.994639312 podStartE2EDuration="9.994639312s" podCreationTimestamp="2026-01-30 18:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:51.987514455 +0000 UTC m=+88.255892480" watchObservedRunningTime="2026-01-30 18:31:51.994639312 +0000 UTC m=+88.263017337" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.201929 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.290135 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kubelet-dir\") pod \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.290322 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kube-api-access\") pod \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\" (UID: \"3a0ce7f8-c7eb-4ead-b166-c909664846ac\") " Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.290334 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3a0ce7f8-c7eb-4ead-b166-c909664846ac" (UID: "3a0ce7f8-c7eb-4ead-b166-c909664846ac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.290910 4782 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.301749 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3a0ce7f8-c7eb-4ead-b166-c909664846ac" (UID: "3a0ce7f8-c7eb-4ead-b166-c909664846ac"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.392596 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a0ce7f8-c7eb-4ead-b166-c909664846ac-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.963195 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3a0ce7f8-c7eb-4ead-b166-c909664846ac","Type":"ContainerDied","Data":"390a1c6950b00979bb9e8415256f09b72110de506539b6c4089c1428d1ae3be9"} Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.963284 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390a1c6950b00979bb9e8415256f09b72110de506539b6c4089c1428d1ae3be9" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.963334 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.963846 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:52 crc kubenswrapper[4782]: I0130 18:31:52.973150 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:31:54 crc kubenswrapper[4782]: I0130 18:31:54.975936 4782 generic.go:334] "Generic (PLEG): container finished" podID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerID="60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c" exitCode=0 Jan 30 18:31:54 crc kubenswrapper[4782]: I0130 18:31:54.976013 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg9pw" event={"ID":"c8ded13e-75b7-4dc5-9b72-631f07ab52da","Type":"ContainerDied","Data":"60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c"} Jan 30 18:31:54 crc kubenswrapper[4782]: I0130 18:31:54.978600 4782 generic.go:334] "Generic (PLEG): container finished" podID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerID="0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8" exitCode=0 Jan 30 18:31:54 crc kubenswrapper[4782]: I0130 18:31:54.978741 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg5cd" event={"ID":"0e13d1a3-9ea0-470c-8e34-c935718e7fcf","Type":"ContainerDied","Data":"0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8"} Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.087705 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 18:31:56 crc kubenswrapper[4782]: E0130 18:31:56.088927 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a0ce7f8-c7eb-4ead-b166-c909664846ac" containerName="pruner" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.088948 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0ce7f8-c7eb-4ead-b166-c909664846ac" containerName="pruner" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.089159 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a0ce7f8-c7eb-4ead-b166-c909664846ac" containerName="pruner" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.089875 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.093257 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.093450 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.098628 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.145186 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kube-api-access\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.145332 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.145373 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-var-lock\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.246990 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.247670 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-var-lock\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.247108 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.247735 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-var-lock\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.247845 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.271548 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kube-api-access\") pod \"installer-9-crc\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.429794 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.872212 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.994649 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg5cd" event={"ID":"0e13d1a3-9ea0-470c-8e34-c935718e7fcf","Type":"ContainerStarted","Data":"3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5"} Jan 30 18:31:56 crc kubenswrapper[4782]: I0130 18:31:56.997938 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg9pw" event={"ID":"c8ded13e-75b7-4dc5-9b72-631f07ab52da","Type":"ContainerStarted","Data":"dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab"} Jan 30 18:31:57 crc kubenswrapper[4782]: I0130 18:31:57.000287 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2","Type":"ContainerStarted","Data":"1e8c77b5f00d27be2621ef2f3f164d0903dbc4795a6a6b6a5eb3a6dda230bde7"} Jan 30 18:31:57 crc kubenswrapper[4782]: I0130 18:31:57.019751 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qg5cd" podStartSLOduration=3.691917941 podStartE2EDuration="47.019729846s" podCreationTimestamp="2026-01-30 18:31:10 +0000 UTC" firstStartedPulling="2026-01-30 18:31:12.638249356 +0000 UTC m=+48.906627381" lastFinishedPulling="2026-01-30 18:31:55.966061261 +0000 UTC m=+92.234439286" observedRunningTime="2026-01-30 18:31:57.014606249 +0000 UTC m=+93.282984274" watchObservedRunningTime="2026-01-30 18:31:57.019729846 +0000 UTC m=+93.288107881" Jan 30 18:31:57 crc kubenswrapper[4782]: I0130 18:31:57.037509 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vg9pw" podStartSLOduration=3.682084625 podStartE2EDuration="45.037490236s" podCreationTimestamp="2026-01-30 18:31:12 +0000 UTC" firstStartedPulling="2026-01-30 18:31:14.671506328 +0000 UTC m=+50.939884363" lastFinishedPulling="2026-01-30 18:31:56.026911949 +0000 UTC m=+92.295289974" observedRunningTime="2026-01-30 18:31:57.036969743 +0000 UTC m=+93.305347768" watchObservedRunningTime="2026-01-30 18:31:57.037490236 +0000 UTC m=+93.305868261" Jan 30 18:31:58 crc kubenswrapper[4782]: I0130 18:31:58.008389 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2","Type":"ContainerStarted","Data":"e1629227e5961f5bc4f2ddace2d8629dfc8ef54dacd8841b3418de077f9f6f6a"} Jan 30 18:31:58 crc kubenswrapper[4782]: I0130 18:31:58.028080 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" 
podStartSLOduration=2.028058816 podStartE2EDuration="2.028058816s" podCreationTimestamp="2026-01-30 18:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:31:58.027797139 +0000 UTC m=+94.296175164" watchObservedRunningTime="2026-01-30 18:31:58.028058816 +0000 UTC m=+94.296436841" Jan 30 18:32:00 crc kubenswrapper[4782]: I0130 18:32:00.583197 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:32:00 crc kubenswrapper[4782]: I0130 18:32:00.733999 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:32:00 crc kubenswrapper[4782]: I0130 18:32:00.734043 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:32:00 crc kubenswrapper[4782]: I0130 18:32:00.773751 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:32:01 crc kubenswrapper[4782]: I0130 18:32:01.077706 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:32:01 crc kubenswrapper[4782]: I0130 18:32:01.240116 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:32:02 crc kubenswrapper[4782]: I0130 18:32:02.565063 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d7795ff57-b25nq"] Jan 30 18:32:02 crc kubenswrapper[4782]: I0130 18:32:02.565733 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" podUID="aaa1216f-28e2-485d-affa-f9795b56467d" containerName="controller-manager" containerID="cri-o://1def8a0bf6cfa6bcb3191ad4305d2125612c0daff8b44e0c0605a54e68ffb189" gracePeriod=30 Jan 30 18:32:02 crc kubenswrapper[4782]: I0130 18:32:02.608655 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv"] Jan 30 18:32:02 crc kubenswrapper[4782]: I0130 18:32:02.609410 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" podUID="76b68ce0-487b-45b1-8afe-01c527033ddb" containerName="route-controller-manager" containerID="cri-o://2ccf842732f23fceefb5e20dfb399005c17bcea11b2be4fb062549cc928be3e1" gracePeriod=30 Jan 30 18:32:02 crc kubenswrapper[4782]: I0130 18:32:02.904251 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gsv8k"] Jan 30 18:32:02 crc kubenswrapper[4782]: I0130 18:32:02.989855 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:32:02 crc kubenswrapper[4782]: I0130 18:32:02.989911 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.055747 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jngb2" 
event={"ID":"93582a18-a665-4de9-b213-bae40598079d","Type":"ContainerStarted","Data":"9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f"} Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.058335 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.060723 4782 generic.go:334] "Generic (PLEG): container finished" podID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerID="da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f" exitCode=0 Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.060819 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94k6x" event={"ID":"76621964-5fa4-4a75-b7d2-9f148a3c701f","Type":"ContainerDied","Data":"da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f"} Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.063510 4782 generic.go:334] "Generic (PLEG): container finished" podID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerID="5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514" exitCode=0 Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.063664 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbhn" event={"ID":"913d2663-2aea-4ac0-98bc-eb817aee0f98","Type":"ContainerDied","Data":"5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514"} Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.067279 4782 generic.go:334] "Generic (PLEG): container finished" podID="aaa1216f-28e2-485d-affa-f9795b56467d" containerID="1def8a0bf6cfa6bcb3191ad4305d2125612c0daff8b44e0c0605a54e68ffb189" exitCode=0 Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.067379 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" event={"ID":"aaa1216f-28e2-485d-affa-f9795b56467d","Type":"ContainerDied","Data":"1def8a0bf6cfa6bcb3191ad4305d2125612c0daff8b44e0c0605a54e68ffb189"} Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.072749 4782 generic.go:334] "Generic (PLEG): container finished" podID="76b68ce0-487b-45b1-8afe-01c527033ddb" containerID="2ccf842732f23fceefb5e20dfb399005c17bcea11b2be4fb062549cc928be3e1" exitCode=0 Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.073994 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" event={"ID":"76b68ce0-487b-45b1-8afe-01c527033ddb","Type":"ContainerDied","Data":"2ccf842732f23fceefb5e20dfb399005c17bcea11b2be4fb062549cc928be3e1"} Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.137455 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.160035 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.232240 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.243536 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-config\") pod \"aaa1216f-28e2-485d-affa-f9795b56467d\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.243596 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th47v\" (UniqueName: \"kubernetes.io/projected/aaa1216f-28e2-485d-affa-f9795b56467d-kube-api-access-th47v\") pod \"aaa1216f-28e2-485d-affa-f9795b56467d\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.243664 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-client-ca\") pod \"aaa1216f-28e2-485d-affa-f9795b56467d\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.243718 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-proxy-ca-bundles\") pod \"aaa1216f-28e2-485d-affa-f9795b56467d\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.243770 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1216f-28e2-485d-affa-f9795b56467d-serving-cert\") pod \"aaa1216f-28e2-485d-affa-f9795b56467d\" (UID: \"aaa1216f-28e2-485d-affa-f9795b56467d\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.245661 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-client-ca" (OuterVolumeSpecName: "client-ca") pod "aaa1216f-28e2-485d-affa-f9795b56467d" (UID: "aaa1216f-28e2-485d-affa-f9795b56467d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.245883 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aaa1216f-28e2-485d-affa-f9795b56467d" (UID: "aaa1216f-28e2-485d-affa-f9795b56467d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.246152 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-config" (OuterVolumeSpecName: "config") pod "aaa1216f-28e2-485d-affa-f9795b56467d" (UID: "aaa1216f-28e2-485d-affa-f9795b56467d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.256062 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa1216f-28e2-485d-affa-f9795b56467d-kube-api-access-th47v" (OuterVolumeSpecName: "kube-api-access-th47v") pod "aaa1216f-28e2-485d-affa-f9795b56467d" (UID: "aaa1216f-28e2-485d-affa-f9795b56467d"). 
InnerVolumeSpecName "kube-api-access-th47v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.257667 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa1216f-28e2-485d-affa-f9795b56467d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aaa1216f-28e2-485d-affa-f9795b56467d" (UID: "aaa1216f-28e2-485d-affa-f9795b56467d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.344604 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76b68ce0-487b-45b1-8afe-01c527033ddb-serving-cert\") pod \"76b68ce0-487b-45b1-8afe-01c527033ddb\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.344684 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-config\") pod \"76b68ce0-487b-45b1-8afe-01c527033ddb\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.344775 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-client-ca\") pod \"76b68ce0-487b-45b1-8afe-01c527033ddb\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.344798 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dt25\" (UniqueName: \"kubernetes.io/projected/76b68ce0-487b-45b1-8afe-01c527033ddb-kube-api-access-2dt25\") pod \"76b68ce0-487b-45b1-8afe-01c527033ddb\" (UID: \"76b68ce0-487b-45b1-8afe-01c527033ddb\") " Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.345098 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.345114 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th47v\" (UniqueName: \"kubernetes.io/projected/aaa1216f-28e2-485d-affa-f9795b56467d-kube-api-access-th47v\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.345125 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.345134 4782 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aaa1216f-28e2-485d-affa-f9795b56467d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.345143 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aaa1216f-28e2-485d-affa-f9795b56467d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.346720 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-config" (OuterVolumeSpecName: "config") pod "76b68ce0-487b-45b1-8afe-01c527033ddb" (UID: 
"76b68ce0-487b-45b1-8afe-01c527033ddb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.346952 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-client-ca" (OuterVolumeSpecName: "client-ca") pod "76b68ce0-487b-45b1-8afe-01c527033ddb" (UID: "76b68ce0-487b-45b1-8afe-01c527033ddb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.348537 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b68ce0-487b-45b1-8afe-01c527033ddb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76b68ce0-487b-45b1-8afe-01c527033ddb" (UID: "76b68ce0-487b-45b1-8afe-01c527033ddb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.348974 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b68ce0-487b-45b1-8afe-01c527033ddb-kube-api-access-2dt25" (OuterVolumeSpecName: "kube-api-access-2dt25") pod "76b68ce0-487b-45b1-8afe-01c527033ddb" (UID: "76b68ce0-487b-45b1-8afe-01c527033ddb"). InnerVolumeSpecName "kube-api-access-2dt25". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.446332 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.446546 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dt25\" (UniqueName: \"kubernetes.io/projected/76b68ce0-487b-45b1-8afe-01c527033ddb-kube-api-access-2dt25\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.446612 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76b68ce0-487b-45b1-8afe-01c527033ddb-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:03 crc kubenswrapper[4782]: I0130 18:32:03.446703 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b68ce0-487b-45b1-8afe-01c527033ddb-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.081107 4782 generic.go:334] "Generic (PLEG): container finished" podID="93582a18-a665-4de9-b213-bae40598079d" containerID="9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f" exitCode=0 Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.081400 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jngb2" event={"ID":"93582a18-a665-4de9-b213-bae40598079d","Type":"ContainerDied","Data":"9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f"} Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.089610 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94k6x" event={"ID":"76621964-5fa4-4a75-b7d2-9f148a3c701f","Type":"ContainerStarted","Data":"f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323"} Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.092258 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bnbhn" event={"ID":"913d2663-2aea-4ac0-98bc-eb817aee0f98","Type":"ContainerStarted","Data":"f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a"} Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.094575 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" event={"ID":"aaa1216f-28e2-485d-affa-f9795b56467d","Type":"ContainerDied","Data":"3a9c715c866cff0a59c933de15d1bbc3c95308374d46d8361665c3516e6348eb"} Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.094655 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7795ff57-b25nq" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.094664 4782 scope.go:117] "RemoveContainer" containerID="1def8a0bf6cfa6bcb3191ad4305d2125612c0daff8b44e0c0605a54e68ffb189" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.097076 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.097126 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv" event={"ID":"76b68ce0-487b-45b1-8afe-01c527033ddb","Type":"ContainerDied","Data":"73152f5339d2b92e055c2792d5279ebcb3042e390335caff7e8af7b8ec51e701"} Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.110798 4782 scope.go:117] "RemoveContainer" containerID="2ccf842732f23fceefb5e20dfb399005c17bcea11b2be4fb062549cc928be3e1" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.127679 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bnbhn" podStartSLOduration=2.325972722 podStartE2EDuration="52.127653331s" podCreationTimestamp="2026-01-30 18:31:12 +0000 UTC" firstStartedPulling="2026-01-30 18:31:13.651951397 +0000 UTC m=+49.920329412" lastFinishedPulling="2026-01-30 18:32:03.453631996 +0000 UTC m=+99.722010021" observedRunningTime="2026-01-30 18:32:04.126526743 +0000 UTC m=+100.394904768" watchObservedRunningTime="2026-01-30 18:32:04.127653331 +0000 UTC m=+100.396031356" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.161741 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94k6x" podStartSLOduration=3.14353762 podStartE2EDuration="54.161720766s" podCreationTimestamp="2026-01-30 18:31:10 +0000 UTC" firstStartedPulling="2026-01-30 18:31:12.596333247 +0000 UTC m=+48.864711282" lastFinishedPulling="2026-01-30 18:32:03.614516403 +0000 UTC m=+99.882894428" observedRunningTime="2026-01-30 18:32:04.156455495 +0000 UTC m=+100.424833520" watchObservedRunningTime="2026-01-30 18:32:04.161720766 +0000 UTC m=+100.430098791" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.180666 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d7795ff57-b25nq"] Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.183462 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d7795ff57-b25nq"] Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.190062 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv"] Jan 30 18:32:04 crc 
kubenswrapper[4782]: I0130 18:32:04.194997 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57d96b979-xljtv"] Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.379308 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n"] Jan 30 18:32:04 crc kubenswrapper[4782]: E0130 18:32:04.379750 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa1216f-28e2-485d-affa-f9795b56467d" containerName="controller-manager" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.379786 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa1216f-28e2-485d-affa-f9795b56467d" containerName="controller-manager" Jan 30 18:32:04 crc kubenswrapper[4782]: E0130 18:32:04.379819 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b68ce0-487b-45b1-8afe-01c527033ddb" containerName="route-controller-manager" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.379828 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b68ce0-487b-45b1-8afe-01c527033ddb" containerName="route-controller-manager" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.380028 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b68ce0-487b-45b1-8afe-01c527033ddb" containerName="route-controller-manager" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.380057 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa1216f-28e2-485d-affa-f9795b56467d" containerName="controller-manager" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.380737 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.384126 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.384335 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.384759 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.385033 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.385342 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.385923 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.386888 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm"] Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.388275 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.397191 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.397451 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.397462 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.397775 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.399571 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.399860 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.403326 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.406597 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm"] Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.425088 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b68ce0-487b-45b1-8afe-01c527033ddb" path="/var/lib/kubelet/pods/76b68ce0-487b-45b1-8afe-01c527033ddb/volumes" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.426191 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa1216f-28e2-485d-affa-f9795b56467d" path="/var/lib/kubelet/pods/aaa1216f-28e2-485d-affa-f9795b56467d/volumes" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.426805 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n"] Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465147 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-config\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465253 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d109cf52-def4-4062-9eca-f5375fee4776-serving-cert\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465299 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-proxy-ca-bundles\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " 
pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465342 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsrl4\" (UniqueName: \"kubernetes.io/projected/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-kube-api-access-vsrl4\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465367 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-config\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465385 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-client-ca\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465441 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl7lb\" (UniqueName: \"kubernetes.io/projected/d109cf52-def4-4062-9eca-f5375fee4776-kube-api-access-fl7lb\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465462 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-serving-cert\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.465495 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-client-ca\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566404 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-proxy-ca-bundles\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566458 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsrl4\" (UniqueName: \"kubernetes.io/projected/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-kube-api-access-vsrl4\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " 
pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566487 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-config\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566507 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-client-ca\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566554 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl7lb\" (UniqueName: \"kubernetes.io/projected/d109cf52-def4-4062-9eca-f5375fee4776-kube-api-access-fl7lb\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566574 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-serving-cert\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566623 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-client-ca\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566660 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-config\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.566683 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d109cf52-def4-4062-9eca-f5375fee4776-serving-cert\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.567527 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-proxy-ca-bundles\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.568284 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-client-ca\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.568291 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-client-ca\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.568499 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-config\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.574820 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d109cf52-def4-4062-9eca-f5375fee4776-serving-cert\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.574852 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-serving-cert\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.578444 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-config\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.588107 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsrl4\" (UniqueName: \"kubernetes.io/projected/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-kube-api-access-vsrl4\") pod \"controller-manager-7f584f5fbb-ccjdm\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.588806 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl7lb\" (UniqueName: \"kubernetes.io/projected/d109cf52-def4-4062-9eca-f5375fee4776-kube-api-access-fl7lb\") pod \"route-controller-manager-6966978d95-b7z2n\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.708628 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.719518 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.745337 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bldf"] Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.745618 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5bldf" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="registry-server" containerID="cri-o://bcf6a913807a3b71609f18922b01a8ade8300f55a4bbf40ca42fbdbb87d4fed1" gracePeriod=2 Jan 30 18:32:04 crc kubenswrapper[4782]: I0130 18:32:04.999216 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n"] Jan 30 18:32:05 crc kubenswrapper[4782]: W0130 18:32:05.018285 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd109cf52_def4_4062_9eca_f5375fee4776.slice/crio-d3833e997923e1152988a2050d4c88c5a49f5066d7450777733260bfa5729608 WatchSource:0}: Error finding container d3833e997923e1152988a2050d4c88c5a49f5066d7450777733260bfa5729608: Status 404 returned error can't find the container with id d3833e997923e1152988a2050d4c88c5a49f5066d7450777733260bfa5729608 Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.110944 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jngb2" event={"ID":"93582a18-a665-4de9-b213-bae40598079d","Type":"ContainerStarted","Data":"c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7"} Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.134088 4782 generic.go:334] "Generic (PLEG): container finished" podID="30838f0d-efae-4fdf-b098-14537a312bb3" containerID="bcf6a913807a3b71609f18922b01a8ade8300f55a4bbf40ca42fbdbb87d4fed1" exitCode=0 Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.134171 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bldf" event={"ID":"30838f0d-efae-4fdf-b098-14537a312bb3","Type":"ContainerDied","Data":"bcf6a913807a3b71609f18922b01a8ade8300f55a4bbf40ca42fbdbb87d4fed1"} Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.134156 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jngb2" podStartSLOduration=6.178237848 podStartE2EDuration="52.134055773s" podCreationTimestamp="2026-01-30 18:31:13 +0000 UTC" firstStartedPulling="2026-01-30 18:31:18.551328332 +0000 UTC m=+54.819706357" lastFinishedPulling="2026-01-30 18:32:04.507146257 +0000 UTC m=+100.775524282" observedRunningTime="2026-01-30 18:32:05.127097881 +0000 UTC m=+101.395475926" watchObservedRunningTime="2026-01-30 18:32:05.134055773 +0000 UTC m=+101.402433798" Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.141466 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4jgx" event={"ID":"2eeba928-9384-4789-b6d2-dbc557b815d5","Type":"ContainerStarted","Data":"fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745"} Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.144700 4782 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" event={"ID":"d109cf52-def4-4062-9eca-f5375fee4776","Type":"ContainerStarted","Data":"d3833e997923e1152988a2050d4c88c5a49f5066d7450777733260bfa5729608"} Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.249554 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.320208 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm"] Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.380443 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvrjg\" (UniqueName: \"kubernetes.io/projected/30838f0d-efae-4fdf-b098-14537a312bb3-kube-api-access-lvrjg\") pod \"30838f0d-efae-4fdf-b098-14537a312bb3\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.380598 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-catalog-content\") pod \"30838f0d-efae-4fdf-b098-14537a312bb3\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.380649 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-utilities\") pod \"30838f0d-efae-4fdf-b098-14537a312bb3\" (UID: \"30838f0d-efae-4fdf-b098-14537a312bb3\") " Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.381541 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-utilities" (OuterVolumeSpecName: "utilities") pod "30838f0d-efae-4fdf-b098-14537a312bb3" (UID: "30838f0d-efae-4fdf-b098-14537a312bb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.381806 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.387745 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30838f0d-efae-4fdf-b098-14537a312bb3-kube-api-access-lvrjg" (OuterVolumeSpecName: "kube-api-access-lvrjg") pod "30838f0d-efae-4fdf-b098-14537a312bb3" (UID: "30838f0d-efae-4fdf-b098-14537a312bb3"). InnerVolumeSpecName "kube-api-access-lvrjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.453251 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30838f0d-efae-4fdf-b098-14537a312bb3" (UID: "30838f0d-efae-4fdf-b098-14537a312bb3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.483058 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30838f0d-efae-4fdf-b098-14537a312bb3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:05 crc kubenswrapper[4782]: I0130 18:32:05.483113 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvrjg\" (UniqueName: \"kubernetes.io/projected/30838f0d-efae-4fdf-b098-14537a312bb3-kube-api-access-lvrjg\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.155103 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" event={"ID":"d109cf52-def4-4062-9eca-f5375fee4776","Type":"ContainerStarted","Data":"e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42"} Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.155617 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.158825 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bldf" event={"ID":"30838f0d-efae-4fdf-b098-14537a312bb3","Type":"ContainerDied","Data":"391d73754a6681003005b71b363f0a7f14ce9ef503c7c8434fe6af1c979f7ec8"} Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.158881 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bldf" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.158923 4782 scope.go:117] "RemoveContainer" containerID="bcf6a913807a3b71609f18922b01a8ade8300f55a4bbf40ca42fbdbb87d4fed1" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.164100 4782 generic.go:334] "Generic (PLEG): container finished" podID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerID="fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745" exitCode=0 Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.164240 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4jgx" event={"ID":"2eeba928-9384-4789-b6d2-dbc557b815d5","Type":"ContainerDied","Data":"fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745"} Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.164275 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4jgx" event={"ID":"2eeba928-9384-4789-b6d2-dbc557b815d5","Type":"ContainerStarted","Data":"a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96"} Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.165417 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.169330 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" event={"ID":"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b","Type":"ContainerStarted","Data":"436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef"} Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.169376 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" 
event={"ID":"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b","Type":"ContainerStarted","Data":"8d63a4392b63bb937115985d2c5f8dcc66984453c1a986da5ea332bc9b4883f2"} Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.170467 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.178687 4782 scope.go:117] "RemoveContainer" containerID="d4a7859cf4883277176a235d3def55c7a037b8b0d5fc1646bb4b4b2a3e1e18a1" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.183624 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.189127 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" podStartSLOduration=4.189103512 podStartE2EDuration="4.189103512s" podCreationTimestamp="2026-01-30 18:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:32:06.188835746 +0000 UTC m=+102.457213781" watchObservedRunningTime="2026-01-30 18:32:06.189103512 +0000 UTC m=+102.457481537" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.232934 4782 scope.go:117] "RemoveContainer" containerID="f2d05d5832a56495d7c69b403ab23770f6ea2b72996fdc0eaa85875afe77a1be" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.262295 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" podStartSLOduration=4.262272056 podStartE2EDuration="4.262272056s" podCreationTimestamp="2026-01-30 18:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:32:06.257927008 +0000 UTC m=+102.526305033" watchObservedRunningTime="2026-01-30 18:32:06.262272056 +0000 UTC m=+102.530650081" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.263301 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l4jgx" podStartSLOduration=6.146764679 podStartE2EDuration="53.263292031s" podCreationTimestamp="2026-01-30 18:31:13 +0000 UTC" firstStartedPulling="2026-01-30 18:31:18.551379343 +0000 UTC m=+54.819757398" lastFinishedPulling="2026-01-30 18:32:05.667906725 +0000 UTC m=+101.936284750" observedRunningTime="2026-01-30 18:32:06.238083796 +0000 UTC m=+102.506461841" watchObservedRunningTime="2026-01-30 18:32:06.263292031 +0000 UTC m=+102.531670066" Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.286546 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bldf"] Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.291703 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5bldf"] Jan 30 18:32:06 crc kubenswrapper[4782]: I0130 18:32:06.418214 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" path="/var/lib/kubelet/pods/30838f0d-efae-4fdf-b098-14537a312bb3/volumes" Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.140792 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg9pw"] Jan 30 18:32:07 crc 
kubenswrapper[4782]: I0130 18:32:07.141093 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vg9pw" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="registry-server" containerID="cri-o://dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab" gracePeriod=2 Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.558636 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.613772 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-catalog-content\") pod \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.613922 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-utilities\") pod \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.614000 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xhkz\" (UniqueName: \"kubernetes.io/projected/c8ded13e-75b7-4dc5-9b72-631f07ab52da-kube-api-access-7xhkz\") pod \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\" (UID: \"c8ded13e-75b7-4dc5-9b72-631f07ab52da\") " Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.616034 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-utilities" (OuterVolumeSpecName: "utilities") pod "c8ded13e-75b7-4dc5-9b72-631f07ab52da" (UID: "c8ded13e-75b7-4dc5-9b72-631f07ab52da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.625391 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ded13e-75b7-4dc5-9b72-631f07ab52da-kube-api-access-7xhkz" (OuterVolumeSpecName: "kube-api-access-7xhkz") pod "c8ded13e-75b7-4dc5-9b72-631f07ab52da" (UID: "c8ded13e-75b7-4dc5-9b72-631f07ab52da"). InnerVolumeSpecName "kube-api-access-7xhkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.652626 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8ded13e-75b7-4dc5-9b72-631f07ab52da" (UID: "c8ded13e-75b7-4dc5-9b72-631f07ab52da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.715287 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.715325 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8ded13e-75b7-4dc5-9b72-631f07ab52da-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:07 crc kubenswrapper[4782]: I0130 18:32:07.715340 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xhkz\" (UniqueName: \"kubernetes.io/projected/c8ded13e-75b7-4dc5-9b72-631f07ab52da-kube-api-access-7xhkz\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.194178 4782 generic.go:334] "Generic (PLEG): container finished" podID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerID="dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab" exitCode=0 Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.197130 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg9pw" event={"ID":"c8ded13e-75b7-4dc5-9b72-631f07ab52da","Type":"ContainerDied","Data":"dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab"} Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.197595 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vg9pw" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.198694 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vg9pw" event={"ID":"c8ded13e-75b7-4dc5-9b72-631f07ab52da","Type":"ContainerDied","Data":"4a5badd8a31475a3e2f56bc49daf0ee0568a3b69e6ef58b18b082bbd0d67f7eb"} Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.199106 4782 scope.go:117] "RemoveContainer" containerID="dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.229309 4782 scope.go:117] "RemoveContainer" containerID="60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.269337 4782 scope.go:117] "RemoveContainer" containerID="f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.276895 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg9pw"] Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.283713 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vg9pw"] Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.288000 4782 scope.go:117] "RemoveContainer" containerID="dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab" Jan 30 18:32:08 crc kubenswrapper[4782]: E0130 18:32:08.288624 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab\": container with ID starting with dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab not found: ID does not exist" containerID="dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.288658 4782 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab"} err="failed to get container status \"dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab\": rpc error: code = NotFound desc = could not find container \"dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab\": container with ID starting with dad8d96b6829ba00a55f09c8add35c627732ab3683def971abd5cdc07ff5aeab not found: ID does not exist" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.288723 4782 scope.go:117] "RemoveContainer" containerID="60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c" Jan 30 18:32:08 crc kubenswrapper[4782]: E0130 18:32:08.289045 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c\": container with ID starting with 60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c not found: ID does not exist" containerID="60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.289067 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c"} err="failed to get container status \"60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c\": rpc error: code = NotFound desc = could not find container \"60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c\": container with ID starting with 60d7f3294649cd8e882e40c1ce3c5b75f76b20d2f1a077214d1e66e6fc0a681c not found: ID does not exist" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.289087 4782 scope.go:117] "RemoveContainer" containerID="f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415" Jan 30 18:32:08 crc kubenswrapper[4782]: E0130 18:32:08.289378 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415\": container with ID starting with f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415 not found: ID does not exist" containerID="f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.289411 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415"} err="failed to get container status \"f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415\": rpc error: code = NotFound desc = could not find container \"f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415\": container with ID starting with f247fe02805e394e0bbf4c77bef827c93f1d0f0dc8097f8c6d54b88f723a0415 not found: ID does not exist" Jan 30 18:32:08 crc kubenswrapper[4782]: I0130 18:32:08.417707 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" path="/var/lib/kubelet/pods/c8ded13e-75b7-4dc5-9b72-631f07ab52da/volumes" Jan 30 18:32:10 crc kubenswrapper[4782]: I0130 18:32:10.974665 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:32:10 crc kubenswrapper[4782]: I0130 18:32:10.975415 4782 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:32:11 crc kubenswrapper[4782]: I0130 18:32:11.024426 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:32:11 crc kubenswrapper[4782]: I0130 18:32:11.261482 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:32:12 crc kubenswrapper[4782]: I0130 18:32:12.140441 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94k6x"] Jan 30 18:32:12 crc kubenswrapper[4782]: I0130 18:32:12.550202 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:32:12 crc kubenswrapper[4782]: I0130 18:32:12.550384 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:32:12 crc kubenswrapper[4782]: I0130 18:32:12.611312 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.237045 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94k6x" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="registry-server" containerID="cri-o://f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323" gracePeriod=2 Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.292139 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.789704 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.915971 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-catalog-content\") pod \"76621964-5fa4-4a75-b7d2-9f148a3c701f\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.916061 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-utilities\") pod \"76621964-5fa4-4a75-b7d2-9f148a3c701f\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.916201 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxb47\" (UniqueName: \"kubernetes.io/projected/76621964-5fa4-4a75-b7d2-9f148a3c701f-kube-api-access-hxb47\") pod \"76621964-5fa4-4a75-b7d2-9f148a3c701f\" (UID: \"76621964-5fa4-4a75-b7d2-9f148a3c701f\") " Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.920075 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-utilities" (OuterVolumeSpecName: "utilities") pod "76621964-5fa4-4a75-b7d2-9f148a3c701f" (UID: "76621964-5fa4-4a75-b7d2-9f148a3c701f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:13 crc kubenswrapper[4782]: I0130 18:32:13.927008 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76621964-5fa4-4a75-b7d2-9f148a3c701f-kube-api-access-hxb47" (OuterVolumeSpecName: "kube-api-access-hxb47") pod "76621964-5fa4-4a75-b7d2-9f148a3c701f" (UID: "76621964-5fa4-4a75-b7d2-9f148a3c701f"). InnerVolumeSpecName "kube-api-access-hxb47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.018700 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.018772 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxb47\" (UniqueName: \"kubernetes.io/projected/76621964-5fa4-4a75-b7d2-9f148a3c701f-kube-api-access-hxb47\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.021017 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76621964-5fa4-4a75-b7d2-9f148a3c701f" (UID: "76621964-5fa4-4a75-b7d2-9f148a3c701f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.033775 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.033868 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.111788 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.120285 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76621964-5fa4-4a75-b7d2-9f148a3c701f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.136422 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.137684 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.205634 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.247779 4782 generic.go:334] "Generic (PLEG): container finished" podID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerID="f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323" exitCode=0 Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.247861 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94k6x" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.247911 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94k6x" event={"ID":"76621964-5fa4-4a75-b7d2-9f148a3c701f","Type":"ContainerDied","Data":"f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323"} Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.247950 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94k6x" event={"ID":"76621964-5fa4-4a75-b7d2-9f148a3c701f","Type":"ContainerDied","Data":"6726cb013c2eb79bf5e7a65b364ba38d3a531043adc006afadb9d079cfc610c6"} Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.247971 4782 scope.go:117] "RemoveContainer" containerID="f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.275861 4782 scope.go:117] "RemoveContainer" containerID="da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.291940 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94k6x"] Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.294515 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94k6x"] Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.309714 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.310447 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.318675 4782 scope.go:117] "RemoveContainer" containerID="1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.352650 4782 scope.go:117] "RemoveContainer" containerID="f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323" Jan 30 18:32:14 crc kubenswrapper[4782]: E0130 18:32:14.359168 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323\": container with ID starting with f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323 not found: ID does not exist" containerID="f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.359231 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323"} err="failed to get container status \"f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323\": rpc error: code = NotFound desc = could not find container \"f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323\": container with ID starting with f8ce4d599d13105a5866ec740795091e7c863e7da196693d0aadd0b2f2bb6323 not found: ID does not exist" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.359312 4782 scope.go:117] "RemoveContainer" containerID="da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f" Jan 30 18:32:14 crc kubenswrapper[4782]: E0130 18:32:14.359933 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f\": container with ID starting with da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f not found: ID does not exist" containerID="da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.359990 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f"} err="failed to get container status \"da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f\": rpc error: code = NotFound desc = could not find container \"da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f\": container with ID starting with da874ff6528936f5835c4131388135f75adf68bfd7a7312e54cd0f22eb3d948f not found: ID does not exist" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.360029 4782 scope.go:117] "RemoveContainer" containerID="1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d" Jan 30 18:32:14 crc kubenswrapper[4782]: E0130 18:32:14.360373 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d\": container with ID starting with 1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d not found: ID does not exist" containerID="1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.360407 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d"} err="failed to get container status \"1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d\": rpc error: code = NotFound desc = could not find container \"1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d\": container with ID starting with 1b0525cc0a11de3dc0c043237053bc0522ec7482c8d21b8d520b9709f3f8130d not found: ID does not exist" Jan 30 18:32:14 crc kubenswrapper[4782]: I0130 18:32:14.439947 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" path="/var/lib/kubelet/pods/76621964-5fa4-4a75-b7d2-9f148a3c701f/volumes" Jan 30 18:32:16 crc kubenswrapper[4782]: I0130 18:32:16.548524 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jngb2"] Jan 30 18:32:17 crc kubenswrapper[4782]: I0130 18:32:17.267361 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jngb2" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="registry-server" containerID="cri-o://c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7" gracePeriod=2 Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.084517 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.200223 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7qpv\" (UniqueName: \"kubernetes.io/projected/93582a18-a665-4de9-b213-bae40598079d-kube-api-access-m7qpv\") pod \"93582a18-a665-4de9-b213-bae40598079d\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.200341 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-utilities\") pod \"93582a18-a665-4de9-b213-bae40598079d\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.200534 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-catalog-content\") pod \"93582a18-a665-4de9-b213-bae40598079d\" (UID: \"93582a18-a665-4de9-b213-bae40598079d\") " Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.201944 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-utilities" (OuterVolumeSpecName: "utilities") pod "93582a18-a665-4de9-b213-bae40598079d" (UID: "93582a18-a665-4de9-b213-bae40598079d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.209753 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93582a18-a665-4de9-b213-bae40598079d-kube-api-access-m7qpv" (OuterVolumeSpecName: "kube-api-access-m7qpv") pod "93582a18-a665-4de9-b213-bae40598079d" (UID: "93582a18-a665-4de9-b213-bae40598079d"). InnerVolumeSpecName "kube-api-access-m7qpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.278769 4782 generic.go:334] "Generic (PLEG): container finished" podID="93582a18-a665-4de9-b213-bae40598079d" containerID="c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7" exitCode=0 Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.278821 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jngb2" event={"ID":"93582a18-a665-4de9-b213-bae40598079d","Type":"ContainerDied","Data":"c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7"} Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.278861 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jngb2" event={"ID":"93582a18-a665-4de9-b213-bae40598079d","Type":"ContainerDied","Data":"22a68c0e6eec6f2800c0b3ecf62ed44fc561c88638f8eafa589acea675859e8c"} Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.278882 4782 scope.go:117] "RemoveContainer" containerID="c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.278911 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jngb2" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.295640 4782 scope.go:117] "RemoveContainer" containerID="9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.301860 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7qpv\" (UniqueName: \"kubernetes.io/projected/93582a18-a665-4de9-b213-bae40598079d-kube-api-access-m7qpv\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.301899 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.316394 4782 scope.go:117] "RemoveContainer" containerID="0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.332825 4782 scope.go:117] "RemoveContainer" containerID="c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7" Jan 30 18:32:18 crc kubenswrapper[4782]: E0130 18:32:18.333488 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7\": container with ID starting with c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7 not found: ID does not exist" containerID="c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.333522 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7"} err="failed to get container status \"c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7\": rpc error: code = NotFound desc = could not find container \"c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7\": container with ID starting with c748598a8a6b7f78b8f5ac58f848135fa374f5be27cce70cdbd19d5b4fc82ad7 not found: ID does not exist" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.333549 4782 scope.go:117] "RemoveContainer" containerID="9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f" Jan 30 18:32:18 crc kubenswrapper[4782]: E0130 18:32:18.333883 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f\": container with ID starting with 9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f not found: ID does not exist" containerID="9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.333902 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f"} err="failed to get container status \"9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f\": rpc error: code = NotFound desc = could not find container \"9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f\": container with ID starting with 9c4ff6b094751f39aaad02f7f11351a4165254d2a4383edfe16855b14a3fa89f not found: ID does not exist" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.333916 4782 scope.go:117] "RemoveContainer" 
containerID="0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d" Jan 30 18:32:18 crc kubenswrapper[4782]: E0130 18:32:18.334310 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d\": container with ID starting with 0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d not found: ID does not exist" containerID="0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.334332 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d"} err="failed to get container status \"0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d\": rpc error: code = NotFound desc = could not find container \"0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d\": container with ID starting with 0fa2253c01f43f45250f8865dfccdbb310d1bc9fa63abfbfa811350cfadba07d not found: ID does not exist" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.670425 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93582a18-a665-4de9-b213-bae40598079d" (UID: "93582a18-a665-4de9-b213-bae40598079d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.714216 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93582a18-a665-4de9-b213-bae40598079d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.912155 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jngb2"] Jan 30 18:32:18 crc kubenswrapper[4782]: I0130 18:32:18.917378 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jngb2"] Jan 30 18:32:20 crc kubenswrapper[4782]: I0130 18:32:20.424518 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93582a18-a665-4de9-b213-bae40598079d" path="/var/lib/kubelet/pods/93582a18-a665-4de9-b213-bae40598079d/volumes" Jan 30 18:32:22 crc kubenswrapper[4782]: I0130 18:32:22.592424 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm"] Jan 30 18:32:22 crc kubenswrapper[4782]: I0130 18:32:22.593355 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" podUID="da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" containerName="controller-manager" containerID="cri-o://436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef" gracePeriod=30 Jan 30 18:32:22 crc kubenswrapper[4782]: I0130 18:32:22.684152 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n"] Jan 30 18:32:22 crc kubenswrapper[4782]: I0130 18:32:22.684539 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" podUID="d109cf52-def4-4062-9eca-f5375fee4776" containerName="route-controller-manager" 
containerID="cri-o://e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42" gracePeriod=30 Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.202399 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.286337 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d109cf52-def4-4062-9eca-f5375fee4776-serving-cert\") pod \"d109cf52-def4-4062-9eca-f5375fee4776\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.286395 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-client-ca\") pod \"d109cf52-def4-4062-9eca-f5375fee4776\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.286428 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-config\") pod \"d109cf52-def4-4062-9eca-f5375fee4776\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.286450 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl7lb\" (UniqueName: \"kubernetes.io/projected/d109cf52-def4-4062-9eca-f5375fee4776-kube-api-access-fl7lb\") pod \"d109cf52-def4-4062-9eca-f5375fee4776\" (UID: \"d109cf52-def4-4062-9eca-f5375fee4776\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.287582 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-client-ca" (OuterVolumeSpecName: "client-ca") pod "d109cf52-def4-4062-9eca-f5375fee4776" (UID: "d109cf52-def4-4062-9eca-f5375fee4776"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.288407 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-config" (OuterVolumeSpecName: "config") pod "d109cf52-def4-4062-9eca-f5375fee4776" (UID: "d109cf52-def4-4062-9eca-f5375fee4776"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.295187 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d109cf52-def4-4062-9eca-f5375fee4776-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d109cf52-def4-4062-9eca-f5375fee4776" (UID: "d109cf52-def4-4062-9eca-f5375fee4776"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.295294 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d109cf52-def4-4062-9eca-f5375fee4776-kube-api-access-fl7lb" (OuterVolumeSpecName: "kube-api-access-fl7lb") pod "d109cf52-def4-4062-9eca-f5375fee4776" (UID: "d109cf52-def4-4062-9eca-f5375fee4776"). InnerVolumeSpecName "kube-api-access-fl7lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.308939 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.315427 4782 generic.go:334] "Generic (PLEG): container finished" podID="d109cf52-def4-4062-9eca-f5375fee4776" containerID="e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42" exitCode=0 Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.315496 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" event={"ID":"d109cf52-def4-4062-9eca-f5375fee4776","Type":"ContainerDied","Data":"e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42"} Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.315528 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" event={"ID":"d109cf52-def4-4062-9eca-f5375fee4776","Type":"ContainerDied","Data":"d3833e997923e1152988a2050d4c88c5a49f5066d7450777733260bfa5729608"} Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.315547 4782 scope.go:117] "RemoveContainer" containerID="e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.315721 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.318022 4782 generic.go:334] "Generic (PLEG): container finished" podID="da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" containerID="436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef" exitCode=0 Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.318080 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" event={"ID":"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b","Type":"ContainerDied","Data":"436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef"} Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.318125 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" event={"ID":"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b","Type":"ContainerDied","Data":"8d63a4392b63bb937115985d2c5f8dcc66984453c1a986da5ea332bc9b4883f2"} Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.318203 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.332691 4782 scope.go:117] "RemoveContainer" containerID="e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42" Jan 30 18:32:23 crc kubenswrapper[4782]: E0130 18:32:23.333318 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42\": container with ID starting with e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42 not found: ID does not exist" containerID="e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.333367 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42"} err="failed to get container status \"e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42\": rpc error: code = NotFound desc = could not find container \"e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42\": container with ID starting with e25692f631d863dcb5a71bf0f0f7618f72866aa55692b0c47dea65f6f9645b42 not found: ID does not exist" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.333402 4782 scope.go:117] "RemoveContainer" containerID="436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.364434 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n"] Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.366571 4782 scope.go:117] "RemoveContainer" containerID="436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef" Jan 30 18:32:23 crc kubenswrapper[4782]: E0130 18:32:23.367272 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef\": container with ID starting with 436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef not found: ID does not exist" containerID="436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.367422 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef"} err="failed to get container status \"436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef\": rpc error: code = NotFound desc = could not find container \"436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef\": container with ID starting with 436ee907af97a2fa72dd5794a7fc4f1d9727e4674e1b606755528369765c1aef not found: ID does not exist" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.367670 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6966978d95-b7z2n"] Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.388822 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d109cf52-def4-4062-9eca-f5375fee4776-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.388862 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.388895 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d109cf52-def4-4062-9eca-f5375fee4776-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.388907 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl7lb\" (UniqueName: \"kubernetes.io/projected/d109cf52-def4-4062-9eca-f5375fee4776-kube-api-access-fl7lb\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.490434 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-client-ca\") pod \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.490545 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-config\") pod \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.490601 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsrl4\" (UniqueName: \"kubernetes.io/projected/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-kube-api-access-vsrl4\") pod \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.490657 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-proxy-ca-bundles\") pod \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.490699 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-serving-cert\") pod \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\" (UID: \"da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b\") " Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.491372 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-client-ca" (OuterVolumeSpecName: "client-ca") pod "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" (UID: "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.491882 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" (UID: "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.491941 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-config" (OuterVolumeSpecName: "config") pod "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" (UID: "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.496316 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" (UID: "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.496411 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-kube-api-access-vsrl4" (OuterVolumeSpecName: "kube-api-access-vsrl4") pod "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" (UID: "da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b"). InnerVolumeSpecName "kube-api-access-vsrl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.592048 4782 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.592205 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.592384 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsrl4\" (UniqueName: \"kubernetes.io/projected/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-kube-api-access-vsrl4\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.592428 4782 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.592440 4782 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.645305 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm"] Jan 30 18:32:23 crc kubenswrapper[4782]: I0130 18:32:23.654720 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f584f5fbb-ccjdm"] Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.398447 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d689474fb-tbbl7"] Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.398974 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399017 4782 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399055 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399071 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399094 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399136 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399155 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399174 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399196 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d109cf52-def4-4062-9eca-f5375fee4776" containerName="route-controller-manager" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399211 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d109cf52-def4-4062-9eca-f5375fee4776" containerName="route-controller-manager" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399270 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399286 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399306 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399321 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399341 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399355 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="extract-utilities" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399373 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" containerName="controller-manager" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399388 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" containerName="controller-manager" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399443 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="extract-content" Jan 30 18:32:24 crc 
kubenswrapper[4782]: I0130 18:32:24.399461 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="extract-content" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399478 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="extract-content" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399494 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="extract-content" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399514 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="extract-content" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399529 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="extract-content" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399549 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="extract-content" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399565 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="extract-content" Jan 30 18:32:24 crc kubenswrapper[4782]: E0130 18:32:24.399585 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399601 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399857 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" containerName="controller-manager" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399888 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d109cf52-def4-4062-9eca-f5375fee4776" containerName="route-controller-manager" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399936 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ded13e-75b7-4dc5-9b72-631f07ab52da" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399963 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="93582a18-a665-4de9-b213-bae40598079d" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.399984 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="30838f0d-efae-4fdf-b098-14537a312bb3" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.400017 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="76621964-5fa4-4a75-b7d2-9f148a3c701f" containerName="registry-server" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.400955 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.401165 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94"] Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.402161 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.406887 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.407317 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.407508 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.407705 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.418037 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.424899 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.427757 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.428936 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.436749 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.436920 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.437954 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.442393 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.458801 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.499805 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d109cf52-def4-4062-9eca-f5375fee4776" path="/var/lib/kubelet/pods/d109cf52-def4-4062-9eca-f5375fee4776/volumes" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.500726 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b" path="/var/lib/kubelet/pods/da8e68ee-b2f5-4679-8af9-a5cd3bd6a90b/volumes" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.501328 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94"] Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.501369 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d689474fb-tbbl7"] Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504147 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-proxy-ca-bundles\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504482 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm6bh\" (UniqueName: \"kubernetes.io/projected/d3e9c427-9f43-414d-b36f-2f482086117a-kube-api-access-fm6bh\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504533 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-client-ca\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504568 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-config\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504628 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-client-ca\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504660 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwdhz\" (UniqueName: \"kubernetes.io/projected/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-kube-api-access-bwdhz\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504690 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-serving-cert\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504736 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e9c427-9f43-414d-b36f-2f482086117a-serving-cert\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.504771 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-config\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.605737 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm6bh\" (UniqueName: \"kubernetes.io/projected/d3e9c427-9f43-414d-b36f-2f482086117a-kube-api-access-fm6bh\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.605847 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-client-ca\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.605914 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-config\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.605990 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-client-ca\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.606025 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwdhz\" (UniqueName: \"kubernetes.io/projected/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-kube-api-access-bwdhz\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.606071 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-serving-cert\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.606108 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e9c427-9f43-414d-b36f-2f482086117a-serving-cert\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.606160 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-config\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.606262 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-proxy-ca-bundles\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.607055 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-client-ca\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.607788 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-config\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.608313 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-client-ca\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.609222 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-config\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.611428 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3e9c427-9f43-414d-b36f-2f482086117a-proxy-ca-bundles\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.614980 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-serving-cert\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.616516 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e9c427-9f43-414d-b36f-2f482086117a-serving-cert\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " 
pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.635367 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm6bh\" (UniqueName: \"kubernetes.io/projected/d3e9c427-9f43-414d-b36f-2f482086117a-kube-api-access-fm6bh\") pod \"controller-manager-7d689474fb-tbbl7\" (UID: \"d3e9c427-9f43-414d-b36f-2f482086117a\") " pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.638308 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwdhz\" (UniqueName: \"kubernetes.io/projected/c7f850e3-86e1-4f5a-9f00-9442cdd0aec2-kube-api-access-bwdhz\") pod \"route-controller-manager-6fff9856f4-2gs94\" (UID: \"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2\") " pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.776138 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:24 crc kubenswrapper[4782]: I0130 18:32:24.787792 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.080143 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d689474fb-tbbl7"] Jan 30 18:32:25 crc kubenswrapper[4782]: W0130 18:32:25.101392 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e9c427_9f43_414d_b36f_2f482086117a.slice/crio-e32c1df060dc8c5b82636b08c5bc4217233713c5d42d546016a850d75830ec0e WatchSource:0}: Error finding container e32c1df060dc8c5b82636b08c5bc4217233713c5d42d546016a850d75830ec0e: Status 404 returned error can't find the container with id e32c1df060dc8c5b82636b08c5bc4217233713c5d42d546016a850d75830ec0e Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.212591 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94"] Jan 30 18:32:25 crc kubenswrapper[4782]: W0130 18:32:25.221890 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f850e3_86e1_4f5a_9f00_9442cdd0aec2.slice/crio-12ed760296b04e6491178950d34133aaab0c02cfafdfb99fff0b847f54867712 WatchSource:0}: Error finding container 12ed760296b04e6491178950d34133aaab0c02cfafdfb99fff0b847f54867712: Status 404 returned error can't find the container with id 12ed760296b04e6491178950d34133aaab0c02cfafdfb99fff0b847f54867712 Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.338959 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" event={"ID":"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2","Type":"ContainerStarted","Data":"12ed760296b04e6491178950d34133aaab0c02cfafdfb99fff0b847f54867712"} Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.340723 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" event={"ID":"d3e9c427-9f43-414d-b36f-2f482086117a","Type":"ContainerStarted","Data":"320ee64905964e2157e006918a5737aaee795f834bf48a550ebed92bec27c838"} Jan 30 
18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.340891 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" event={"ID":"d3e9c427-9f43-414d-b36f-2f482086117a","Type":"ContainerStarted","Data":"e32c1df060dc8c5b82636b08c5bc4217233713c5d42d546016a850d75830ec0e"} Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.341116 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.343494 4782 patch_prober.go:28] interesting pod/controller-manager-7d689474fb-tbbl7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.343554 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" podUID="d3e9c427-9f43-414d-b36f-2f482086117a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Jan 30 18:32:25 crc kubenswrapper[4782]: I0130 18:32:25.358248 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" podStartSLOduration=3.358205196 podStartE2EDuration="3.358205196s" podCreationTimestamp="2026-01-30 18:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:32:25.357625522 +0000 UTC m=+121.626003547" watchObservedRunningTime="2026-01-30 18:32:25.358205196 +0000 UTC m=+121.626583211" Jan 30 18:32:26 crc kubenswrapper[4782]: I0130 18:32:26.349295 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" event={"ID":"c7f850e3-86e1-4f5a-9f00-9442cdd0aec2","Type":"ContainerStarted","Data":"72eba2cc914bbbf1b0b0db28314ea89b35e9938718e0aeb343bf190c044b9712"} Jan 30 18:32:26 crc kubenswrapper[4782]: I0130 18:32:26.355762 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d689474fb-tbbl7" Jan 30 18:32:26 crc kubenswrapper[4782]: I0130 18:32:26.367850 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" podStartSLOduration=4.367828888 podStartE2EDuration="4.367828888s" podCreationTimestamp="2026-01-30 18:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:32:26.366387433 +0000 UTC m=+122.634765458" watchObservedRunningTime="2026-01-30 18:32:26.367828888 +0000 UTC m=+122.636206933" Jan 30 18:32:27 crc kubenswrapper[4782]: I0130 18:32:27.358736 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:27 crc kubenswrapper[4782]: I0130 18:32:27.366596 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fff9856f4-2gs94" Jan 30 18:32:27 crc kubenswrapper[4782]: I0130 
18:32:27.953220 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" podUID="059df750-c2da-429e-bae4-c7271be158af" containerName="oauth-openshift" containerID="cri-o://8a83dfb7af9effd6c233a77b6add42ce363a718a8b333e71ebd1f1d115a41859" gracePeriod=15 Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.368613 4782 generic.go:334] "Generic (PLEG): container finished" podID="059df750-c2da-429e-bae4-c7271be158af" containerID="8a83dfb7af9effd6c233a77b6add42ce363a718a8b333e71ebd1f1d115a41859" exitCode=0 Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.368754 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" event={"ID":"059df750-c2da-429e-bae4-c7271be158af","Type":"ContainerDied","Data":"8a83dfb7af9effd6c233a77b6add42ce363a718a8b333e71ebd1f1d115a41859"} Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.506194 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669359 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-service-ca\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669426 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-login\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669475 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-ocp-branding-template\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669501 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-provider-selection\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669526 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-session\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669558 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-audit-policies\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669607 4782 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-flvgz\" (UniqueName: \"kubernetes.io/projected/059df750-c2da-429e-bae4-c7271be158af-kube-api-access-flvgz\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669637 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-error\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669665 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-trusted-ca-bundle\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669723 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-router-certs\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669785 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-cliconfig\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669812 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-serving-cert\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669839 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-idp-0-file-data\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.669871 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/059df750-c2da-429e-bae4-c7271be158af-audit-dir\") pod \"059df750-c2da-429e-bae4-c7271be158af\" (UID: \"059df750-c2da-429e-bae4-c7271be158af\") " Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.670264 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/059df750-c2da-429e-bae4-c7271be158af-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.670590 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.671495 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.672049 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.673420 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.678417 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.680126 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059df750-c2da-429e-bae4-c7271be158af-kube-api-access-flvgz" (OuterVolumeSpecName: "kube-api-access-flvgz") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "kube-api-access-flvgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.680313 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.681355 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.682630 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.683470 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.683765 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.684519 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.684819 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "059df750-c2da-429e-bae4-c7271be158af" (UID: "059df750-c2da-429e-bae4-c7271be158af"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771125 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flvgz\" (UniqueName: \"kubernetes.io/projected/059df750-c2da-429e-bae4-c7271be158af-kube-api-access-flvgz\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771186 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771203 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771219 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771251 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771261 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771270 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771281 4782 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/059df750-c2da-429e-bae4-c7271be158af-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771290 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771299 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771308 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771331 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771340 4782 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/059df750-c2da-429e-bae4-c7271be158af-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:28 crc kubenswrapper[4782]: I0130 18:32:28.771348 4782 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/059df750-c2da-429e-bae4-c7271be158af-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:29 crc kubenswrapper[4782]: I0130 18:32:29.382075 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" Jan 30 18:32:29 crc kubenswrapper[4782]: I0130 18:32:29.383602 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gsv8k" event={"ID":"059df750-c2da-429e-bae4-c7271be158af","Type":"ContainerDied","Data":"82d0fe0831385987ff1153df90293e9515a66987a7b29f2da49b2c7a726f816b"} Jan 30 18:32:29 crc kubenswrapper[4782]: I0130 18:32:29.383655 4782 scope.go:117] "RemoveContainer" containerID="8a83dfb7af9effd6c233a77b6add42ce363a718a8b333e71ebd1f1d115a41859" Jan 30 18:32:29 crc kubenswrapper[4782]: I0130 18:32:29.446445 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gsv8k"] Jan 30 18:32:29 crc kubenswrapper[4782]: I0130 18:32:29.452420 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gsv8k"] Jan 30 18:32:30 crc kubenswrapper[4782]: I0130 18:32:30.417876 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059df750-c2da-429e-bae4-c7271be158af" path="/var/lib/kubelet/pods/059df750-c2da-429e-bae4-c7271be158af/volumes" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.980113 4782 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.980925 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059df750-c2da-429e-bae4-c7271be158af" containerName="oauth-openshift" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.980954 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="059df750-c2da-429e-bae4-c7271be158af" containerName="oauth-openshift" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.981200 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="059df750-c2da-429e-bae4-c7271be158af" containerName="oauth-openshift" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.981904 4782 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.982218 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.982551 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007" gracePeriod=15 Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.982582 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab" gracePeriod=15 Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.982626 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98" gracePeriod=15 Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.982671 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da" gracePeriod=15 Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.982592 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9" gracePeriod=15 Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.985889 4782 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.986408 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.986598 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.986769 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.986975 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.987209 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.987434 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.987615 4782 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.987783 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.987959 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.988144 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.988379 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.988559 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 18:32:34 crc kubenswrapper[4782]: E0130 18:32:34.988730 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.988887 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.989855 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.990099 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.990628 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.990915 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.991098 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 18:32:34 crc kubenswrapper[4782]: I0130 18:32:34.991319 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.062092 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.062726 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.063056 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.063357 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.063645 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.063869 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.064090 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.064360 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.165905 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.165954 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.165983 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166022 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166041 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166062 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166082 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166101 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166168 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166209 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166246 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166267 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 
18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166291 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166310 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166331 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.166351 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.443657 4782 generic.go:334] "Generic (PLEG): container finished" podID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" containerID="e1629227e5961f5bc4f2ddace2d8629dfc8ef54dacd8841b3418de077f9f6f6a" exitCode=0 Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.443848 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2","Type":"ContainerDied","Data":"e1629227e5961f5bc4f2ddace2d8629dfc8ef54dacd8841b3418de077f9f6f6a"} Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.445416 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.445844 4782 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.447702 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.449947 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.451408 4782 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab" exitCode=0 Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.451466 4782 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98" exitCode=0 Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.451487 4782 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9" exitCode=0 Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.451514 4782 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da" exitCode=2 Jan 30 18:32:35 crc kubenswrapper[4782]: I0130 18:32:35.451609 4782 scope.go:117] "RemoveContainer" containerID="349ae9462f0e179adfeddb396ee8e9dcc07ca74c2083a420366b015d43e73805" Jan 30 18:32:36 crc kubenswrapper[4782]: I0130 18:32:36.466902 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 18:32:36 crc kubenswrapper[4782]: I0130 18:32:36.998654 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:36.999947 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.095851 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kube-api-access\") pod \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.096382 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kubelet-dir\") pod \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.096444 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-var-lock\") pod \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\" (UID: \"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2\") " Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.096804 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-var-lock" (OuterVolumeSpecName: "var-lock") pod "ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" (UID: "ae9c4fd4-7593-46e0-b333-bf967f2cc1a2"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.096854 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" (UID: "ae9c4fd4-7593-46e0-b333-bf967f2cc1a2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.139816 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" (UID: "ae9c4fd4-7593-46e0-b333-bf967f2cc1a2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.198769 4782 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.198820 4782 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.198838 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae9c4fd4-7593-46e0-b333-bf967f2cc1a2-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.458077 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.459806 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.460767 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.461339 4782 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.480464 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.480495 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ae9c4fd4-7593-46e0-b333-bf967f2cc1a2","Type":"ContainerDied","Data":"1e8c77b5f00d27be2621ef2f3f164d0903dbc4795a6a6b6a5eb3a6dda230bde7"} Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.480576 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8c77b5f00d27be2621ef2f3f164d0903dbc4795a6a6b6a5eb3a6dda230bde7" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.486396 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.488683 4782 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007" exitCode=0 Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.488800 4782 scope.go:117] "RemoveContainer" containerID="6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.488825 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.508947 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.509944 4782 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.523115 4782 scope.go:117] "RemoveContainer" containerID="7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.538016 4782 scope.go:117] "RemoveContainer" containerID="f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.557163 4782 scope.go:117] "RemoveContainer" containerID="1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.581287 4782 scope.go:117] "RemoveContainer" containerID="967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.598635 4782 scope.go:117] "RemoveContainer" containerID="338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.603588 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.603745 4782 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.603780 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.603786 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.603889 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.603936 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.604115 4782 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.604152 4782 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.604175 4782 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.623572 4782 scope.go:117] "RemoveContainer" containerID="6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab" Jan 30 18:32:37 crc kubenswrapper[4782]: E0130 18:32:37.624090 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\": container with ID starting with 6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab not found: ID does not exist" containerID="6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.624146 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab"} err="failed to get container status \"6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\": rpc error: code = NotFound desc = could not find container \"6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab\": container with ID starting with 6a67165c56316cda176b945e2384c370029a9dafc5edbc3852a019dae067fbab not found: ID does not exist" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.624183 4782 scope.go:117] "RemoveContainer" containerID="7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98" Jan 30 18:32:37 crc kubenswrapper[4782]: E0130 18:32:37.624708 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\": container with ID starting with 7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98 not found: ID does not exist" containerID="7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.624773 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98"} err="failed to get container status \"7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\": rpc error: code = NotFound desc = could not find container \"7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98\": container with ID starting with 7082b06d356014f2c9e9c1e5a4dfcdee7714ae0945cc354118bd43737bfbae98 not found: ID does not exist" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.624825 4782 scope.go:117] "RemoveContainer" containerID="f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9" Jan 30 18:32:37 crc kubenswrapper[4782]: E0130 18:32:37.625399 4782 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\": container with ID starting with f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9 not found: ID does not exist" containerID="f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.625437 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9"} err="failed to get container status \"f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\": rpc error: code = NotFound desc = could not find container \"f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9\": container with ID starting with f0618df508901bb36f5b2c773fa1faab62afd039d4995b4477a062f7f8b1f5d9 not found: ID does not exist" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.625460 4782 scope.go:117] "RemoveContainer" containerID="1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da" Jan 30 18:32:37 crc kubenswrapper[4782]: E0130 18:32:37.625825 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\": container with ID starting with 1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da not found: ID does not exist" containerID="1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.625873 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da"} err="failed to get container status \"1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\": rpc error: code = NotFound desc = could not find container \"1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da\": container with ID starting with 1bc04a2f8b777f367115b8066e64de6d7bc26e55b102d32bb5a765c3a57e77da not found: ID does not exist" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.625906 4782 scope.go:117] "RemoveContainer" containerID="967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007" Jan 30 18:32:37 crc kubenswrapper[4782]: E0130 18:32:37.626657 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\": container with ID starting with 967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007 not found: ID does not exist" containerID="967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.626689 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007"} err="failed to get container status \"967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\": rpc error: code = NotFound desc = could not find container \"967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007\": container with ID starting with 967322f6c4aa10ff655173c1082aa545f9fe119e164ad26505341851759ea007 not found: ID does not exist" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.626710 4782 scope.go:117] "RemoveContainer" 
containerID="338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8" Jan 30 18:32:37 crc kubenswrapper[4782]: E0130 18:32:37.627068 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\": container with ID starting with 338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8 not found: ID does not exist" containerID="338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.627119 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8"} err="failed to get container status \"338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\": rpc error: code = NotFound desc = could not find container \"338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8\": container with ID starting with 338f363b05c975243d3f902e9fc1c17bc5930361a687c6d6c7e4ff8b14c0d7f8 not found: ID does not exist" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.814748 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:37 crc kubenswrapper[4782]: I0130 18:32:37.815407 4782 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:38 crc kubenswrapper[4782]: I0130 18:32:38.424858 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 30 18:32:40 crc kubenswrapper[4782]: E0130 18:32:40.036975 4782 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:40 crc kubenswrapper[4782]: I0130 18:32:40.038008 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:40 crc kubenswrapper[4782]: E0130 18:32:40.084943 4782 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f95de2f03d011 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 18:32:40.083951633 +0000 UTC m=+136.352329698,LastTimestamp:2026-01-30 18:32:40.083951633 +0000 UTC m=+136.352329698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 18:32:40 crc kubenswrapper[4782]: I0130 18:32:40.520500 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43"} Jan 30 18:32:40 crc kubenswrapper[4782]: I0130 18:32:40.520569 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2e14d93e2e74dea837e1ccac9d16a17819e560e0e9864181ee248aaf4a1ea531"} Jan 30 18:32:40 crc kubenswrapper[4782]: I0130 18:32:40.521306 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:40 crc kubenswrapper[4782]: E0130 18:32:40.521665 4782 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.501530 4782 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f95de2f03d011 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present 
on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 18:32:40.083951633 +0000 UTC m=+136.352329698,LastTimestamp:2026-01-30 18:32:40.083951633 +0000 UTC m=+136.352329698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.624047 4782 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.624675 4782 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.625218 4782 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.625740 4782 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.626216 4782 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:43 crc kubenswrapper[4782]: I0130 18:32:43.626306 4782 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.626704 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms" Jan 30 18:32:43 crc kubenswrapper[4782]: E0130 18:32:43.828081 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms" Jan 30 18:32:44 crc kubenswrapper[4782]: E0130 18:32:44.229097 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Jan 30 18:32:44 crc kubenswrapper[4782]: I0130 18:32:44.415337 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:45 crc kubenswrapper[4782]: E0130 18:32:45.030420 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s" Jan 30 18:32:46 crc kubenswrapper[4782]: E0130 18:32:46.631743 4782 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.410028 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.411140 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.424505 4782 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.424702 4782 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:48 crc kubenswrapper[4782]: E0130 18:32:48.425475 4782 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.426113 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:48 crc kubenswrapper[4782]: W0130 18:32:48.445113 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-4f8370439fe0c07f415bba0a37b4c0605d128e4530259dea5ad0243d860dae0f WatchSource:0}: Error finding container 4f8370439fe0c07f415bba0a37b4c0605d128e4530259dea5ad0243d860dae0f: Status 404 returned error can't find the container with id 4f8370439fe0c07f415bba0a37b4c0605d128e4530259dea5ad0243d860dae0f Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.582337 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.582416 4782 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ee770bfa5252c9d774f9f21d0f942eb6eeac2f5c8708cc9daa5afed482debec0" exitCode=1 Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.582473 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ee770bfa5252c9d774f9f21d0f942eb6eeac2f5c8708cc9daa5afed482debec0"} Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.583097 4782 scope.go:117] "RemoveContainer" containerID="ee770bfa5252c9d774f9f21d0f942eb6eeac2f5c8708cc9daa5afed482debec0" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.583642 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f8370439fe0c07f415bba0a37b4c0605d128e4530259dea5ad0243d860dae0f"} Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.583931 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:48 crc kubenswrapper[4782]: I0130 18:32:48.584898 4782 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.593492 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.593859 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0ef7d6f772f2caa927eb624cd46237584eb47e98e7ae4427c3f580a4787dab4"} Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.594671 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.594883 4782 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.596054 4782 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a7e300a6ff84bf77ba6a827088536894e901e776cd736f9dda581df5d76c9c6b" exitCode=0 Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.596093 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a7e300a6ff84bf77ba6a827088536894e901e776cd736f9dda581df5d76c9c6b"} Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.596578 4782 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.596597 4782 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:49 crc kubenswrapper[4782]: E0130 18:32:49.596870 4782 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.596906 4782 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.597082 4782 status_manager.go:851] "Failed to get status for pod" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.792546 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:32:49 crc kubenswrapper[4782]: I0130 18:32:49.792622 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:32:49 crc kubenswrapper[4782]: E0130 18:32:49.833887 4782 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="6.4s" Jan 30 18:32:50 crc kubenswrapper[4782]: I0130 18:32:50.603733 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"091db2d81f83097f55cfcd3e829911079fc27bec17a9bd2b1bcd17f1d4ab8113"} Jan 30 18:32:50 crc kubenswrapper[4782]: I0130 18:32:50.603910 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"88abe3e091486c5f5fedf5c3a7ffc4544c7aa1eb1a8f50ba09acfed48eed98a9"} Jan 30 18:32:50 crc kubenswrapper[4782]: I0130 18:32:50.603921 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"99d30688e1f5ad2e998a89ef84ad330fa96f87a0da6667be0b430714707df7d5"} Jan 30 18:32:51 crc kubenswrapper[4782]: I0130 18:32:51.613587 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f2f6fa29cab9eb481a234c0435d9ee128a570b0be20f7d404590842e284c7c26"} Jan 30 18:32:51 crc kubenswrapper[4782]: I0130 18:32:51.613647 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"435510b588deb587b723948803673664608ee12a9723f67c1c111b1c4f9e3444"} Jan 30 18:32:51 crc kubenswrapper[4782]: I0130 18:32:51.613889 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:51 crc kubenswrapper[4782]: I0130 18:32:51.614073 4782 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:51 crc kubenswrapper[4782]: I0130 18:32:51.614110 4782 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:53 crc kubenswrapper[4782]: I0130 18:32:53.426888 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:53 crc kubenswrapper[4782]: I0130 18:32:53.426959 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:53 crc kubenswrapper[4782]: I0130 18:32:53.432607 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.455782 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.464873 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.624657 4782 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.657890 4782 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.657967 4782 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.663149 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.665854 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:32:56 crc kubenswrapper[4782]: I0130 18:32:56.699211 4782 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="33d21bd7-4e50-4178-9c18-6ea143e82fd4" Jan 30 18:32:57 crc kubenswrapper[4782]: I0130 18:32:57.664820 4782 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:57 crc kubenswrapper[4782]: I0130 18:32:57.665797 4782 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:32:57 crc kubenswrapper[4782]: I0130 18:32:57.669271 4782 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="33d21bd7-4e50-4178-9c18-6ea143e82fd4" Jan 30 18:33:06 crc kubenswrapper[4782]: I0130 18:33:06.850190 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 18:33:07 crc kubenswrapper[4782]: I0130 18:33:07.116933 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 30 18:33:07 crc kubenswrapper[4782]: I0130 18:33:07.449399 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 18:33:07 crc kubenswrapper[4782]: I0130 18:33:07.745752 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 18:33:07 crc kubenswrapper[4782]: I0130 18:33:07.857901 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 18:33:07 crc kubenswrapper[4782]: I0130 18:33:07.928122 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.143586 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.268805 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.409119 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 
18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.609766 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.609860 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.678687 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.702810 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.773393 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 18:33:08 crc kubenswrapper[4782]: I0130 18:33:08.914844 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.044893 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.158407 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.222016 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.300959 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.324345 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.332535 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.423069 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.556660 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.592570 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.597423 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.614282 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.758481 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.797497 4782 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.917656 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 18:33:09 crc kubenswrapper[4782]: I0130 18:33:09.925817 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.122053 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.161850 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.212891 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.287707 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.320166 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.326901 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.370453 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.447439 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.522725 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.560273 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.593816 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.731112 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.731401 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.777973 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.786544 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.809006 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.846932 4782 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.892917 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.935239 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.973614 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 30 18:33:10 crc kubenswrapper[4782]: I0130 18:33:10.978984 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.081168 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.190923 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.264807 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.476258 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.531576 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.555987 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.725684 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.783741 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.894310 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 18:33:11 crc kubenswrapper[4782]: I0130 18:33:11.960301 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.007806 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.026914 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.108857 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.124983 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.227773 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 18:33:12 
crc kubenswrapper[4782]: I0130 18:33:12.383360 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.498029 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.560440 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.600214 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.622605 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.697500 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.732713 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.853153 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 30 18:33:12 crc kubenswrapper[4782]: I0130 18:33:12.932263 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.047109 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.114050 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.125185 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.251324 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.258791 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.293763 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.314129 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.340419 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.364161 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.491979 4782 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.517418 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.533928 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.545529 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.548555 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.567154 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.639582 4782 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.641319 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.667873 4782 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.761990 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.763845 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.764082 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.805395 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.814352 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.820388 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.838512 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.854215 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.890419 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.925528 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.944834 4782 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 18:33:13 crc kubenswrapper[4782]: I0130 18:33:13.989216 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.000209 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.032845 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.132746 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.205252 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.256950 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.279751 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.464910 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.485311 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.544944 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.560192 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.565926 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.566820 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.660069 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.661704 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.679725 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.753097 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.755365 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.853665 
4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.854908 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.904857 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 30 18:33:14 crc kubenswrapper[4782]: I0130 18:33:14.935519 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.054977 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.063397 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.070722 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.205614 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.256618 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.336172 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.336286 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.387187 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.573867 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.598403 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.682432 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.729839 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.740726 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.747696 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.802084 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 30 18:33:15 
crc kubenswrapper[4782]: I0130 18:33:15.812568 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.833950 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 18:33:15 crc kubenswrapper[4782]: I0130 18:33:15.989698 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.027563 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.033863 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.042147 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.051083 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.205991 4782 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.213198 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.213304 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm","openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 18:33:16 crc kubenswrapper[4782]: E0130 18:33:16.213581 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" containerName="installer" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.213610 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" containerName="installer" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.213783 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9c4fd4-7593-46e0-b333-bf967f2cc1a2" containerName="installer" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.213874 4782 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.213918 4782 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e5f5b978-22ce-4ef1-8792-a3e12ba1af5c" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.214470 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.216435 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.217144 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.217489 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.219011 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.220762 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.221386 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.221855 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.222370 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.223044 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.223263 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.223765 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.223944 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.226149 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.226358 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.234352 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.238764 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.246863 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.260191 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.260161562 
podStartE2EDuration="20.260161562s" podCreationTimestamp="2026-01-30 18:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:33:16.251882393 +0000 UTC m=+172.520260478" watchObservedRunningTime="2026-01-30 18:33:16.260161562 +0000 UTC m=+172.528539617" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.391180 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.391593 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409168 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409301 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26cb38a3-adb8-4eee-9b09-54294856fd34-audit-dir\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409340 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-audit-policies\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409406 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409439 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lbd\" (UniqueName: \"kubernetes.io/projected/26cb38a3-adb8-4eee-9b09-54294856fd34-kube-api-access-68lbd\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409477 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409531 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409564 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409623 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409689 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409729 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409748 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409791 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.409819 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.417821 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.420428 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.451054 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.467299 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.493580 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510639 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510698 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510755 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510789 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510825 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510858 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: 
\"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510884 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.510927 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.511018 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.511059 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26cb38a3-adb8-4eee-9b09-54294856fd34-audit-dir\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.511092 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-audit-policies\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.511124 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.511157 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lbd\" (UniqueName: \"kubernetes.io/projected/26cb38a3-adb8-4eee-9b09-54294856fd34-kube-api-access-68lbd\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.511190 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " 
pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.511794 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/26cb38a3-adb8-4eee-9b09-54294856fd34-audit-dir\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.513367 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.513417 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.513602 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-audit-policies\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.514218 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.517571 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-error\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.518662 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.519191 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-session\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.525105 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.526540 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-login\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.527068 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.531540 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.532421 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lbd\" (UniqueName: \"kubernetes.io/projected/26cb38a3-adb8-4eee-9b09-54294856fd34-kube-api-access-68lbd\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.532793 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/26cb38a3-adb8-4eee-9b09-54294856fd34-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bcd9fbb-mvwfm\" (UID: \"26cb38a3-adb8-4eee-9b09-54294856fd34\") " pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.549680 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.641588 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.707406 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.767660 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.779536 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.779681 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.780213 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.947062 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 18:33:16 crc kubenswrapper[4782]: I0130 18:33:16.983605 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.001970 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.014593 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.014726 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.062707 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.493977 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.536273 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.576890 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.595057 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.628667 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.680460 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 
18:33:17.720288 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.725383 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.746141 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.799113 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.820408 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.881488 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 18:33:17 crc kubenswrapper[4782]: I0130 18:33:17.894652 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.064953 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.083677 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.219863 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.367641 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.444925 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.483726 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.517139 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.822139 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.844549 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 30 18:33:18 crc kubenswrapper[4782]: I0130 18:33:18.996526 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.032802 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.131183 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.152173 4782 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.152501 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43" gracePeriod=5 Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.184328 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.232907 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.262439 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.414691 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.579547 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.590971 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.597700 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.620693 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.637891 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.691556 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.792800 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.792884 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.796134 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 18:33:19 crc kubenswrapper[4782]: I0130 18:33:19.802151 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 30 18:33:19 crc kubenswrapper[4782]: E0130 
18:33:19.924303 4782 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 30 18:33:19 crc kubenswrapper[4782]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-57bcd9fbb-mvwfm_openshift-authentication_26cb38a3-adb8-4eee-9b09-54294856fd34_0(6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad): error adding pod openshift-authentication_oauth-openshift-57bcd9fbb-mvwfm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad" Netns:"/var/run/netns/86c143bb-cd49-41eb-a274-e2d21d6c02db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-57bcd9fbb-mvwfm;K8S_POD_INFRA_CONTAINER_ID=6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad;K8S_POD_UID=26cb38a3-adb8-4eee-9b09-54294856fd34" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm] networking: Multus: [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm/26cb38a3-adb8-4eee-9b09-54294856fd34]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-57bcd9fbb-mvwfm in out of cluster comm: pod "oauth-openshift-57bcd9fbb-mvwfm" not found Jan 30 18:33:19 crc kubenswrapper[4782]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 18:33:19 crc kubenswrapper[4782]: > Jan 30 18:33:19 crc kubenswrapper[4782]: E0130 18:33:19.924395 4782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 30 18:33:19 crc kubenswrapper[4782]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-57bcd9fbb-mvwfm_openshift-authentication_26cb38a3-adb8-4eee-9b09-54294856fd34_0(6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad): error adding pod openshift-authentication_oauth-openshift-57bcd9fbb-mvwfm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad" Netns:"/var/run/netns/86c143bb-cd49-41eb-a274-e2d21d6c02db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-57bcd9fbb-mvwfm;K8S_POD_INFRA_CONTAINER_ID=6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad;K8S_POD_UID=26cb38a3-adb8-4eee-9b09-54294856fd34" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm] networking: Multus: [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm/26cb38a3-adb8-4eee-9b09-54294856fd34]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-57bcd9fbb-mvwfm in out of cluster comm: pod "oauth-openshift-57bcd9fbb-mvwfm" not found Jan 30 18:33:19 crc kubenswrapper[4782]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 18:33:19 crc kubenswrapper[4782]: > pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:19 crc kubenswrapper[4782]: E0130 18:33:19.924420 4782 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Jan 30 18:33:19 crc kubenswrapper[4782]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-57bcd9fbb-mvwfm_openshift-authentication_26cb38a3-adb8-4eee-9b09-54294856fd34_0(6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad): error adding pod openshift-authentication_oauth-openshift-57bcd9fbb-mvwfm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad" Netns:"/var/run/netns/86c143bb-cd49-41eb-a274-e2d21d6c02db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-57bcd9fbb-mvwfm;K8S_POD_INFRA_CONTAINER_ID=6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad;K8S_POD_UID=26cb38a3-adb8-4eee-9b09-54294856fd34" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm] networking: Multus: [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm/26cb38a3-adb8-4eee-9b09-54294856fd34]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-57bcd9fbb-mvwfm in out of cluster comm: pod "oauth-openshift-57bcd9fbb-mvwfm" not found Jan 30 18:33:19 crc kubenswrapper[4782]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 30 18:33:19 crc kubenswrapper[4782]: > pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:19 crc kubenswrapper[4782]: E0130 18:33:19.924490 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-57bcd9fbb-mvwfm_openshift-authentication(26cb38a3-adb8-4eee-9b09-54294856fd34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-57bcd9fbb-mvwfm_openshift-authentication(26cb38a3-adb8-4eee-9b09-54294856fd34)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-57bcd9fbb-mvwfm_openshift-authentication_26cb38a3-adb8-4eee-9b09-54294856fd34_0(6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad): error adding pod openshift-authentication_oauth-openshift-57bcd9fbb-mvwfm to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad\\\" Netns:\\\"/var/run/netns/86c143bb-cd49-41eb-a274-e2d21d6c02db\\\" 
IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-57bcd9fbb-mvwfm;K8S_POD_INFRA_CONTAINER_ID=6928aa418494fe6d5d6eb3f24114c3fd3bdd33eed8162f449cc0d5f9b46356ad;K8S_POD_UID=26cb38a3-adb8-4eee-9b09-54294856fd34\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm] networking: Multus: [openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm/26cb38a3-adb8-4eee-9b09-54294856fd34]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-57bcd9fbb-mvwfm in out of cluster comm: pod \\\"oauth-openshift-57bcd9fbb-mvwfm\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" podUID="26cb38a3-adb8-4eee-9b09-54294856fd34" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.018054 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.126898 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.200931 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.263529 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.373109 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.427538 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.552399 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.714045 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.896513 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.916975 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 30 18:33:20 crc kubenswrapper[4782]: I0130 18:33:20.939375 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.023603 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 30 
18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.052087 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.067720 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.128387 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.244025 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.262385 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.302508 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.383075 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.605290 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.631424 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.673693 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 30 18:33:21 crc kubenswrapper[4782]: I0130 18:33:21.848791 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 30 18:33:22 crc kubenswrapper[4782]: I0130 18:33:22.098828 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 30 18:33:22 crc kubenswrapper[4782]: I0130 18:33:22.258443 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 18:33:22 crc kubenswrapper[4782]: I0130 18:33:22.759171 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 30 18:33:23 crc kubenswrapper[4782]: I0130 18:33:23.779903 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 30 18:33:23 crc kubenswrapper[4782]: I0130 18:33:23.977466 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.546939 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.756702 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.757194 4782 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.857315 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.857371 4782 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43" exitCode=137 Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.857417 4782 scope.go:117] "RemoveContainer" containerID="34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.857562 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.883540 4782 scope.go:117] "RemoveContainer" containerID="34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43" Jan 30 18:33:24 crc kubenswrapper[4782]: E0130 18:33:24.884391 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43\": container with ID starting with 34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43 not found: ID does not exist" containerID="34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.884491 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43"} err="failed to get container status \"34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43\": rpc error: code = NotFound desc = could not find container \"34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43\": container with ID starting with 34a35e840970be549ecec3748831421c523f54cd6c0f419589fc7518ee6a0d43 not found: ID does not exist" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.933745 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.933840 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.933865 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.933931 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.933950 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.933964 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.934033 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.934092 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.934223 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.934592 4782 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.934616 4782 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.934635 4782 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.934652 4782 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 18:33:24 crc kubenswrapper[4782]: I0130 18:33:24.948224 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:33:25 crc kubenswrapper[4782]: I0130 18:33:25.035742 4782 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 18:33:26 crc kubenswrapper[4782]: I0130 18:33:26.423359 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 30 18:33:31 crc kubenswrapper[4782]: I0130 18:33:31.410697 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:31 crc kubenswrapper[4782]: I0130 18:33:31.413878 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:31 crc kubenswrapper[4782]: I0130 18:33:31.895845 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm"] Jan 30 18:33:31 crc kubenswrapper[4782]: I0130 18:33:31.915783 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" event={"ID":"26cb38a3-adb8-4eee-9b09-54294856fd34","Type":"ContainerStarted","Data":"bcb51eb33f36977ce07ff62bb226b27cdf898db576ab10c2c9d15e9339520b32"} Jan 30 18:33:32 crc kubenswrapper[4782]: I0130 18:33:32.924465 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" event={"ID":"26cb38a3-adb8-4eee-9b09-54294856fd34","Type":"ContainerStarted","Data":"f1e16e39c14249bfe7b53dd70955ab10c108504594e5d8d6abf8968e79fb970f"} Jan 30 18:33:32 crc kubenswrapper[4782]: I0130 18:33:32.924839 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:32 crc kubenswrapper[4782]: I0130 18:33:32.932500 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" Jan 30 18:33:32 crc kubenswrapper[4782]: I0130 18:33:32.952317 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57bcd9fbb-mvwfm" podStartSLOduration=90.952295126 podStartE2EDuration="1m30.952295126s" podCreationTimestamp="2026-01-30 18:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:33:32.948829745 +0000 UTC m=+189.217207770" watchObservedRunningTime="2026-01-30 18:33:32.952295126 +0000 UTC m=+189.220673151" Jan 30 18:33:39 crc kubenswrapper[4782]: I0130 18:33:39.977332 4782 generic.go:334] "Generic (PLEG): container finished" podID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerID="0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a" exitCode=0 Jan 30 18:33:39 crc kubenswrapper[4782]: I0130 18:33:39.977520 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" event={"ID":"8ef2e5ab-f6a0-4735-a5c0-0838345128fb","Type":"ContainerDied","Data":"0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a"} Jan 30 18:33:39 crc kubenswrapper[4782]: I0130 18:33:39.978875 4782 scope.go:117] "RemoveContainer" containerID="0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a" Jan 30 18:33:40 crc kubenswrapper[4782]: I0130 18:33:40.013997 4782 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 18:33:40 crc kubenswrapper[4782]: I0130 18:33:40.991615 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" event={"ID":"8ef2e5ab-f6a0-4735-a5c0-0838345128fb","Type":"ContainerStarted","Data":"d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8"} Jan 30 18:33:40 crc kubenswrapper[4782]: I0130 18:33:40.993144 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:33:40 crc kubenswrapper[4782]: I0130 18:33:40.996380 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:33:47 crc kubenswrapper[4782]: I0130 18:33:47.824727 4782 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 18:33:49 crc kubenswrapper[4782]: I0130 18:33:49.793326 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:33:49 crc kubenswrapper[4782]: I0130 18:33:49.793631 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:33:49 crc kubenswrapper[4782]: I0130 18:33:49.793692 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:33:49 crc kubenswrapper[4782]: I0130 18:33:49.794381 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2a0d39c1e6147d8ef68aae5bf24b942a57ebd4574f48fabe6e7064cbfa21267"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:33:49 crc kubenswrapper[4782]: I0130 18:33:49.794548 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://b2a0d39c1e6147d8ef68aae5bf24b942a57ebd4574f48fabe6e7064cbfa21267" gracePeriod=600 Jan 30 18:33:50 crc kubenswrapper[4782]: I0130 18:33:50.052595 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="b2a0d39c1e6147d8ef68aae5bf24b942a57ebd4574f48fabe6e7064cbfa21267" exitCode=0 Jan 30 18:33:50 crc kubenswrapper[4782]: I0130 18:33:50.052650 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"b2a0d39c1e6147d8ef68aae5bf24b942a57ebd4574f48fabe6e7064cbfa21267"} Jan 30 18:33:51 crc kubenswrapper[4782]: I0130 18:33:51.067710 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"58c67a71506ebafa777e7bddef3e26a0ebec68fccc8ce52f841113c827923688"} Jan 30 18:33:51 crc kubenswrapper[4782]: I0130 18:33:51.530948 4782 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.288467 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jg4gn"] Jan 30 18:34:58 crc kubenswrapper[4782]: E0130 18:34:58.289388 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.289404 
4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.289512 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.289945 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.307588 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jg4gn"] Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.440972 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.441034 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-456jm\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-kube-api-access-456jm\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.441081 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.441114 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-registry-certificates\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.441166 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-bound-sa-token\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.441211 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-registry-tls\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.441286 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.441326 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-trusted-ca\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.466738 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.542523 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-trusted-ca\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.542934 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-456jm\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-kube-api-access-456jm\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.543867 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.545051 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-registry-certificates\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.545264 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-bound-sa-token\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.545415 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-registry-tls\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.545537 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.543989 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-trusted-ca\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.546030 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.547041 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-registry-certificates\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.556355 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.558988 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-registry-tls\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.566710 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-bound-sa-token\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.571541 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-456jm\" (UniqueName: \"kubernetes.io/projected/6e9a486d-3568-4431-aa4f-0c8bcee0b1e8-kube-api-access-456jm\") pod \"image-registry-66df7c8f76-jg4gn\" (UID: \"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8\") " pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:58 crc kubenswrapper[4782]: I0130 18:34:58.616664 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:59 crc kubenswrapper[4782]: I0130 18:34:59.043726 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jg4gn"] Jan 30 18:34:59 crc kubenswrapper[4782]: I0130 18:34:59.516725 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" event={"ID":"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8","Type":"ContainerStarted","Data":"62ab668de14c6984eaea9e5fecaabdf0bd180c9ebc24b33a59a185d77869cf7a"} Jan 30 18:34:59 crc kubenswrapper[4782]: I0130 18:34:59.516787 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" event={"ID":"6e9a486d-3568-4431-aa4f-0c8bcee0b1e8","Type":"ContainerStarted","Data":"826120eaa139e32737c42c1ace5ddc9aff81b5bf85cb0a986de863da4dba8733"} Jan 30 18:34:59 crc kubenswrapper[4782]: I0130 18:34:59.516925 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:34:59 crc kubenswrapper[4782]: I0130 18:34:59.551994 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" podStartSLOduration=1.5519619850000002 podStartE2EDuration="1.551961985s" podCreationTimestamp="2026-01-30 18:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:34:59.544820704 +0000 UTC m=+275.813198769" watchObservedRunningTime="2026-01-30 18:34:59.551961985 +0000 UTC m=+275.820340060" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.509820 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ddhqf"] Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.511321 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ddhqf" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="registry-server" containerID="cri-o://6b501a8d7f132473fd6c980dd9738f0d34b2d61b6a6a96dae6cb18cd28ae981a" gracePeriod=30 Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.530997 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qg5cd"] Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.531455 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qg5cd" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="registry-server" containerID="cri-o://3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5" gracePeriod=30 Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.539918 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-46dsj"] Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.540396 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" containerID="cri-o://d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8" gracePeriod=30 Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.547786 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bnbhn"] Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.548175 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bnbhn" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="registry-server" containerID="cri-o://f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a" gracePeriod=30 Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.552431 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4jgx"] Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.552700 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l4jgx" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="registry-server" containerID="cri-o://a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96" gracePeriod=30 Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.563873 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qsfzf"] Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.565866 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.585965 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qsfzf"] Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.634358 4782 generic.go:334] "Generic (PLEG): container finished" podID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerID="6b501a8d7f132473fd6c980dd9738f0d34b2d61b6a6a96dae6cb18cd28ae981a" exitCode=0 Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.634411 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddhqf" event={"ID":"b84c2e64-bb1c-40ea-a369-55cce87dc7d7","Type":"ContainerDied","Data":"6b501a8d7f132473fd6c980dd9738f0d34b2d61b6a6a96dae6cb18cd28ae981a"} Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.710257 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d993e9b-840e-4235-9d1e-9d2cf1928afc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.710308 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d993e9b-840e-4235-9d1e-9d2cf1928afc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.710354 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvkct\" (UniqueName: \"kubernetes.io/projected/5d993e9b-840e-4235-9d1e-9d2cf1928afc-kube-api-access-xvkct\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.811668 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvkct\" (UniqueName: \"kubernetes.io/projected/5d993e9b-840e-4235-9d1e-9d2cf1928afc-kube-api-access-xvkct\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.811745 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d993e9b-840e-4235-9d1e-9d2cf1928afc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.811773 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d993e9b-840e-4235-9d1e-9d2cf1928afc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.813456 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d993e9b-840e-4235-9d1e-9d2cf1928afc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.820909 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d993e9b-840e-4235-9d1e-9d2cf1928afc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.826883 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvkct\" (UniqueName: \"kubernetes.io/projected/5d993e9b-840e-4235-9d1e-9d2cf1928afc-kube-api-access-xvkct\") pod \"marketplace-operator-79b997595-qsfzf\" (UID: \"5d993e9b-840e-4235-9d1e-9d2cf1928afc\") " pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.869632 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.885189 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.947652 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.952135 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:35:15 crc kubenswrapper[4782]: I0130 18:35:15.966519 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.004069 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.015279 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-utilities\") pod \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.015413 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-utilities\") pod \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.015517 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9zwk\" (UniqueName: \"kubernetes.io/projected/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-kube-api-access-d9zwk\") pod \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.016211 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-utilities" (OuterVolumeSpecName: "utilities") pod "0e13d1a3-9ea0-470c-8e34-c935718e7fcf" (UID: "0e13d1a3-9ea0-470c-8e34-c935718e7fcf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.020467 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-catalog-content\") pod \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.020513 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj898\" (UniqueName: \"kubernetes.io/projected/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-kube-api-access-sj898\") pod \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\" (UID: \"b84c2e64-bb1c-40ea-a369-55cce87dc7d7\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.020538 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-catalog-content\") pod \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\" (UID: \"0e13d1a3-9ea0-470c-8e34-c935718e7fcf\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.020909 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.025604 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-kube-api-access-sj898" (OuterVolumeSpecName: "kube-api-access-sj898") pod "b84c2e64-bb1c-40ea-a369-55cce87dc7d7" (UID: "b84c2e64-bb1c-40ea-a369-55cce87dc7d7"). InnerVolumeSpecName "kube-api-access-sj898". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.025844 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-utilities" (OuterVolumeSpecName: "utilities") pod "b84c2e64-bb1c-40ea-a369-55cce87dc7d7" (UID: "b84c2e64-bb1c-40ea-a369-55cce87dc7d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.027495 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-kube-api-access-d9zwk" (OuterVolumeSpecName: "kube-api-access-d9zwk") pod "0e13d1a3-9ea0-470c-8e34-c935718e7fcf" (UID: "0e13d1a3-9ea0-470c-8e34-c935718e7fcf"). InnerVolumeSpecName "kube-api-access-d9zwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.076455 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e13d1a3-9ea0-470c-8e34-c935718e7fcf" (UID: "0e13d1a3-9ea0-470c-8e34-c935718e7fcf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.085594 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b84c2e64-bb1c-40ea-a369-55cce87dc7d7" (UID: "b84c2e64-bb1c-40ea-a369-55cce87dc7d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122594 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h679v\" (UniqueName: \"kubernetes.io/projected/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-kube-api-access-h679v\") pod \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122653 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-utilities\") pod \"913d2663-2aea-4ac0-98bc-eb817aee0f98\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122719 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-catalog-content\") pod \"2eeba928-9384-4789-b6d2-dbc557b815d5\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122745 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmgng\" (UniqueName: \"kubernetes.io/projected/913d2663-2aea-4ac0-98bc-eb817aee0f98-kube-api-access-xmgng\") pod \"913d2663-2aea-4ac0-98bc-eb817aee0f98\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122774 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-trusted-ca\") pod \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122794 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-utilities\") pod \"2eeba928-9384-4789-b6d2-dbc557b815d5\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122834 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-catalog-content\") pod \"913d2663-2aea-4ac0-98bc-eb817aee0f98\" (UID: \"913d2663-2aea-4ac0-98bc-eb817aee0f98\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122854 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-operator-metrics\") pod \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\" (UID: \"8ef2e5ab-f6a0-4735-a5c0-0838345128fb\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.122875 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6bhm\" (UniqueName: \"kubernetes.io/projected/2eeba928-9384-4789-b6d2-dbc557b815d5-kube-api-access-h6bhm\") pod \"2eeba928-9384-4789-b6d2-dbc557b815d5\" (UID: \"2eeba928-9384-4789-b6d2-dbc557b815d5\") " Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.123058 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-catalog-content\") on node 
\"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.123069 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj898\" (UniqueName: \"kubernetes.io/projected/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-kube-api-access-sj898\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.123081 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.123090 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b84c2e64-bb1c-40ea-a369-55cce87dc7d7-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.123098 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9zwk\" (UniqueName: \"kubernetes.io/projected/0e13d1a3-9ea0-470c-8e34-c935718e7fcf-kube-api-access-d9zwk\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.125241 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-utilities" (OuterVolumeSpecName: "utilities") pod "913d2663-2aea-4ac0-98bc-eb817aee0f98" (UID: "913d2663-2aea-4ac0-98bc-eb817aee0f98"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.126258 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qsfzf"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.126820 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eeba928-9384-4789-b6d2-dbc557b815d5-kube-api-access-h6bhm" (OuterVolumeSpecName: "kube-api-access-h6bhm") pod "2eeba928-9384-4789-b6d2-dbc557b815d5" (UID: "2eeba928-9384-4789-b6d2-dbc557b815d5"). InnerVolumeSpecName "kube-api-access-h6bhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.127261 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-utilities" (OuterVolumeSpecName: "utilities") pod "2eeba928-9384-4789-b6d2-dbc557b815d5" (UID: "2eeba928-9384-4789-b6d2-dbc557b815d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.127633 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8ef2e5ab-f6a0-4735-a5c0-0838345128fb" (UID: "8ef2e5ab-f6a0-4735-a5c0-0838345128fb"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.129366 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913d2663-2aea-4ac0-98bc-eb817aee0f98-kube-api-access-xmgng" (OuterVolumeSpecName: "kube-api-access-xmgng") pod "913d2663-2aea-4ac0-98bc-eb817aee0f98" (UID: "913d2663-2aea-4ac0-98bc-eb817aee0f98"). InnerVolumeSpecName "kube-api-access-xmgng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.130022 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8ef2e5ab-f6a0-4735-a5c0-0838345128fb" (UID: "8ef2e5ab-f6a0-4735-a5c0-0838345128fb"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: W0130 18:35:16.130462 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d993e9b_840e_4235_9d1e_9d2cf1928afc.slice/crio-8451d6bb6e6e3eb83023a910d35ad8235a17be384f5b416d79012abe0c50f4eb WatchSource:0}: Error finding container 8451d6bb6e6e3eb83023a910d35ad8235a17be384f5b416d79012abe0c50f4eb: Status 404 returned error can't find the container with id 8451d6bb6e6e3eb83023a910d35ad8235a17be384f5b416d79012abe0c50f4eb Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.131443 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-kube-api-access-h679v" (OuterVolumeSpecName: "kube-api-access-h679v") pod "8ef2e5ab-f6a0-4735-a5c0-0838345128fb" (UID: "8ef2e5ab-f6a0-4735-a5c0-0838345128fb"). InnerVolumeSpecName "kube-api-access-h679v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.155443 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "913d2663-2aea-4ac0-98bc-eb817aee0f98" (UID: "913d2663-2aea-4ac0-98bc-eb817aee0f98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224495 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224525 4782 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224535 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6bhm\" (UniqueName: \"kubernetes.io/projected/2eeba928-9384-4789-b6d2-dbc557b815d5-kube-api-access-h6bhm\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224544 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h679v\" (UniqueName: \"kubernetes.io/projected/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-kube-api-access-h679v\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224553 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913d2663-2aea-4ac0-98bc-eb817aee0f98-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224562 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmgng\" (UniqueName: \"kubernetes.io/projected/913d2663-2aea-4ac0-98bc-eb817aee0f98-kube-api-access-xmgng\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224571 4782 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ef2e5ab-f6a0-4735-a5c0-0838345128fb-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.224580 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.249374 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2eeba928-9384-4789-b6d2-dbc557b815d5" (UID: "2eeba928-9384-4789-b6d2-dbc557b815d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.325956 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeba928-9384-4789-b6d2-dbc557b815d5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.641329 4782 generic.go:334] "Generic (PLEG): container finished" podID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerID="d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8" exitCode=0 Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.641404 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.641426 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" event={"ID":"8ef2e5ab-f6a0-4735-a5c0-0838345128fb","Type":"ContainerDied","Data":"d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.641831 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-46dsj" event={"ID":"8ef2e5ab-f6a0-4735-a5c0-0838345128fb","Type":"ContainerDied","Data":"19b0128bdcbd7bf3b00278907e2c933e0b328d5fd135eb748e767f048e24b320"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.641850 4782 scope.go:117] "RemoveContainer" containerID="d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.643594 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" event={"ID":"5d993e9b-840e-4235-9d1e-9d2cf1928afc","Type":"ContainerStarted","Data":"ff00cf132d4c391ff2ca748cc817b77c9f6d6d302ea4c61e566ce6754f1067b1"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.643632 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" event={"ID":"5d993e9b-840e-4235-9d1e-9d2cf1928afc","Type":"ContainerStarted","Data":"8451d6bb6e6e3eb83023a910d35ad8235a17be384f5b416d79012abe0c50f4eb"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.644969 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.647339 4782 generic.go:334] "Generic (PLEG): container finished" podID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerID="3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5" exitCode=0 Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.647374 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg5cd" event={"ID":"0e13d1a3-9ea0-470c-8e34-c935718e7fcf","Type":"ContainerDied","Data":"3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.647408 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qg5cd" event={"ID":"0e13d1a3-9ea0-470c-8e34-c935718e7fcf","Type":"ContainerDied","Data":"042d59e0f4d6c7ea46dc1f11088918ba92cd654fabafbadef94e8fe20a503b49"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.647412 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qg5cd" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.650221 4782 generic.go:334] "Generic (PLEG): container finished" podID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerID="f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a" exitCode=0 Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.650306 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbhn" event={"ID":"913d2663-2aea-4ac0-98bc-eb817aee0f98","Type":"ContainerDied","Data":"f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.650332 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbhn" event={"ID":"913d2663-2aea-4ac0-98bc-eb817aee0f98","Type":"ContainerDied","Data":"1367e56f51236d78f50014cbe79e12ab7077ff234a6c63cbe9c92d6d1fe85d17"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.650340 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbhn" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.652672 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ddhqf" event={"ID":"b84c2e64-bb1c-40ea-a369-55cce87dc7d7","Type":"ContainerDied","Data":"c8b8b1c1a70b9bdbbab003fb41fcd931ea762e883bbdadcf416d2fbdae24d833"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.652713 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ddhqf" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.659384 4782 scope.go:117] "RemoveContainer" containerID="0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.659753 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.663645 4782 generic.go:334] "Generic (PLEG): container finished" podID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerID="a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96" exitCode=0 Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.663688 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4jgx" event={"ID":"2eeba928-9384-4789-b6d2-dbc557b815d5","Type":"ContainerDied","Data":"a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.663713 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4jgx" event={"ID":"2eeba928-9384-4789-b6d2-dbc557b815d5","Type":"ContainerDied","Data":"ea6f3270a846f64f98c9041c3bafb76ae9a174c49e52557d916afb1636721dec"} Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.663813 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4jgx" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.667522 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" podStartSLOduration=1.667509361 podStartE2EDuration="1.667509361s" podCreationTimestamp="2026-01-30 18:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:35:16.663055478 +0000 UTC m=+292.931433503" watchObservedRunningTime="2026-01-30 18:35:16.667509361 +0000 UTC m=+292.935887386" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.704220 4782 scope.go:117] "RemoveContainer" containerID="d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.704738 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8\": container with ID starting with d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8 not found: ID does not exist" containerID="d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.704775 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8"} err="failed to get container status \"d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8\": rpc error: code = NotFound desc = could not find container \"d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8\": container with ID starting with d7127f8f4680c941df32c72c50f2816f8a0089d91d9a1927bf58d01fd7cf98d8 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.704803 4782 scope.go:117] "RemoveContainer" containerID="0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.705282 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a\": container with ID starting with 0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a not found: ID does not exist" containerID="0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.705309 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a"} err="failed to get container status \"0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a\": rpc error: code = NotFound desc = could not find container \"0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a\": container with ID starting with 0c2188dbebd2b09f7742be2c8df8b7faa1f1bbd3bb2b46fdeb40779526e4924a not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.705327 4782 scope.go:117] "RemoveContainer" containerID="3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.708180 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qg5cd"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.714658 
4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qg5cd"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.731379 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-46dsj"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.738350 4782 scope.go:117] "RemoveContainer" containerID="0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.740967 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-46dsj"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.748247 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbhn"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.754006 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbhn"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.757358 4782 scope.go:117] "RemoveContainer" containerID="3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.759379 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4jgx"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.763684 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l4jgx"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.767191 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ddhqf"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.769870 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ddhqf"] Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.779047 4782 scope.go:117] "RemoveContainer" containerID="3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.779407 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5\": container with ID starting with 3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5 not found: ID does not exist" containerID="3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.779445 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5"} err="failed to get container status \"3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5\": rpc error: code = NotFound desc = could not find container \"3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5\": container with ID starting with 3f7ee02c00f5e3a3691c286c7fcccd9d70e564a7618c11bd11b1a40b71c139d5 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.779477 4782 scope.go:117] "RemoveContainer" containerID="0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.779772 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8\": container 
with ID starting with 0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8 not found: ID does not exist" containerID="0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.779805 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8"} err="failed to get container status \"0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8\": rpc error: code = NotFound desc = could not find container \"0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8\": container with ID starting with 0ff048ae3a7917d2894b3944073e077d26f885049e78b3efd311af801c3a6cd8 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.779849 4782 scope.go:117] "RemoveContainer" containerID="3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.780050 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80\": container with ID starting with 3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80 not found: ID does not exist" containerID="3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.780076 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80"} err="failed to get container status \"3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80\": rpc error: code = NotFound desc = could not find container \"3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80\": container with ID starting with 3bd87479b33fb773f52a3c98858e05162dc643a7114085e723f252a40d572a80 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.780093 4782 scope.go:117] "RemoveContainer" containerID="f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.791570 4782 scope.go:117] "RemoveContainer" containerID="5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.802889 4782 scope.go:117] "RemoveContainer" containerID="bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.816520 4782 scope.go:117] "RemoveContainer" containerID="f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.816865 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a\": container with ID starting with f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a not found: ID does not exist" containerID="f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.816904 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a"} err="failed to get container status \"f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a\": rpc error: 
code = NotFound desc = could not find container \"f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a\": container with ID starting with f537ac9f935d01b1970a0d2251fc20be8233f57c68e37196e5fb40716dde3e2a not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.816936 4782 scope.go:117] "RemoveContainer" containerID="5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.817172 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514\": container with ID starting with 5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514 not found: ID does not exist" containerID="5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.817196 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514"} err="failed to get container status \"5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514\": rpc error: code = NotFound desc = could not find container \"5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514\": container with ID starting with 5df7b062f7fc4e063371b7b0bfdb1b5a20b709b51dc5c9c930b4863a2aff9514 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.817212 4782 scope.go:117] "RemoveContainer" containerID="bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.817896 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2\": container with ID starting with bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2 not found: ID does not exist" containerID="bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.817921 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2"} err="failed to get container status \"bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2\": rpc error: code = NotFound desc = could not find container \"bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2\": container with ID starting with bf0772247b4a663871977e79cdfcb5ee2033cf8a0123a1d2ad91cf1398ac43f2 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.817940 4782 scope.go:117] "RemoveContainer" containerID="6b501a8d7f132473fd6c980dd9738f0d34b2d61b6a6a96dae6cb18cd28ae981a" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.830106 4782 scope.go:117] "RemoveContainer" containerID="32e8b499e61c0a107c3a942378bdfbbbdb9d422b3e24b3475724de876888ee99" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.840624 4782 scope.go:117] "RemoveContainer" containerID="60da5b1aad0690940671038f93bcbcd97b3abdcedcf1391c5e3a5d9bbc77f064" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.855316 4782 scope.go:117] "RemoveContainer" containerID="a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.870201 4782 scope.go:117] "RemoveContainer" 
containerID="fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.883323 4782 scope.go:117] "RemoveContainer" containerID="09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.894575 4782 scope.go:117] "RemoveContainer" containerID="a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.894988 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96\": container with ID starting with a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96 not found: ID does not exist" containerID="a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.895031 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96"} err="failed to get container status \"a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96\": rpc error: code = NotFound desc = could not find container \"a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96\": container with ID starting with a47dd53c639da320965927d4db5a1142a6f0f200d461a7686b7d1e4d1821ba96 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.895060 4782 scope.go:117] "RemoveContainer" containerID="fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.895387 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745\": container with ID starting with fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745 not found: ID does not exist" containerID="fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.895442 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745"} err="failed to get container status \"fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745\": rpc error: code = NotFound desc = could not find container \"fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745\": container with ID starting with fe7af5b9ca2b7083da8f30fe81c730635647e3829e40a0a4faad87c611d33745 not found: ID does not exist" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.895468 4782 scope.go:117] "RemoveContainer" containerID="09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529" Jan 30 18:35:16 crc kubenswrapper[4782]: E0130 18:35:16.895700 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529\": container with ID starting with 09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529 not found: ID does not exist" containerID="09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529" Jan 30 18:35:16 crc kubenswrapper[4782]: I0130 18:35:16.895731 4782 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529"} err="failed to get container status \"09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529\": rpc error: code = NotFound desc = could not find container \"09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529\": container with ID starting with 09e25c03a326e9649a0d7249d0d7904f0d8b336da4b83cabd8341c06f1836529 not found: ID does not exist" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.741128 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jvlds"] Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.741605 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.741642 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.741835 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.741851 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.741874 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.741893 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.741914 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.741930 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.741956 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.741972 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.741990 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742006 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742024 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742041 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742064 4782 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742083 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742105 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742122 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742143 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742159 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742177 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742193 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742212 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742260 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="extract-utilities" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742279 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742296 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: E0130 18:35:17.742322 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742341 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="extract-content" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742550 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742578 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742600 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742626 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 
18:35:17.742644 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" containerName="registry-server" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.742667 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" containerName="marketplace-operator" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.744299 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.748207 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.751714 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvlds"] Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.849892 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8vmq\" (UniqueName: \"kubernetes.io/projected/858d4185-257a-4486-8147-63381dd9a8f6-kube-api-access-s8vmq\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.849970 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858d4185-257a-4486-8147-63381dd9a8f6-catalog-content\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.850026 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858d4185-257a-4486-8147-63381dd9a8f6-utilities\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.934056 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t6s5g"] Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.935215 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.938800 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.943983 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6s5g"] Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.950942 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858d4185-257a-4486-8147-63381dd9a8f6-utilities\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.951027 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8vmq\" (UniqueName: \"kubernetes.io/projected/858d4185-257a-4486-8147-63381dd9a8f6-kube-api-access-s8vmq\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.951064 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858d4185-257a-4486-8147-63381dd9a8f6-catalog-content\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.951429 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858d4185-257a-4486-8147-63381dd9a8f6-catalog-content\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.951638 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858d4185-257a-4486-8147-63381dd9a8f6-utilities\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:17 crc kubenswrapper[4782]: I0130 18:35:17.978851 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8vmq\" (UniqueName: \"kubernetes.io/projected/858d4185-257a-4486-8147-63381dd9a8f6-kube-api-access-s8vmq\") pod \"certified-operators-jvlds\" (UID: \"858d4185-257a-4486-8147-63381dd9a8f6\") " pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.052573 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8269608-e848-472d-a953-8d8b3a3418e2-utilities\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.052622 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn46b\" (UniqueName: \"kubernetes.io/projected/b8269608-e848-472d-a953-8d8b3a3418e2-kube-api-access-nn46b\") pod \"redhat-marketplace-t6s5g\" (UID: 
\"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.052843 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8269608-e848-472d-a953-8d8b3a3418e2-catalog-content\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.062644 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.154190 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8269608-e848-472d-a953-8d8b3a3418e2-catalog-content\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.154252 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8269608-e848-472d-a953-8d8b3a3418e2-utilities\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.154281 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn46b\" (UniqueName: \"kubernetes.io/projected/b8269608-e848-472d-a953-8d8b3a3418e2-kube-api-access-nn46b\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.155125 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8269608-e848-472d-a953-8d8b3a3418e2-utilities\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.155370 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8269608-e848-472d-a953-8d8b3a3418e2-catalog-content\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.187620 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn46b\" (UniqueName: \"kubernetes.io/projected/b8269608-e848-472d-a953-8d8b3a3418e2-kube-api-access-nn46b\") pod \"redhat-marketplace-t6s5g\" (UID: \"b8269608-e848-472d-a953-8d8b3a3418e2\") " pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.254889 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.254948 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvlds"] Jan 30 18:35:18 crc kubenswrapper[4782]: W0130 18:35:18.270331 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858d4185_257a_4486_8147_63381dd9a8f6.slice/crio-8d6a2a510a4bf7ce3f721ee1aa907dc0b6dbad0353a2a6f9ce59981fac8ee876 WatchSource:0}: Error finding container 8d6a2a510a4bf7ce3f721ee1aa907dc0b6dbad0353a2a6f9ce59981fac8ee876: Status 404 returned error can't find the container with id 8d6a2a510a4bf7ce3f721ee1aa907dc0b6dbad0353a2a6f9ce59981fac8ee876 Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.424827 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e13d1a3-9ea0-470c-8e34-c935718e7fcf" path="/var/lib/kubelet/pods/0e13d1a3-9ea0-470c-8e34-c935718e7fcf/volumes" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.427463 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eeba928-9384-4789-b6d2-dbc557b815d5" path="/var/lib/kubelet/pods/2eeba928-9384-4789-b6d2-dbc557b815d5/volumes" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.428183 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef2e5ab-f6a0-4735-a5c0-0838345128fb" path="/var/lib/kubelet/pods/8ef2e5ab-f6a0-4735-a5c0-0838345128fb/volumes" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.428785 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913d2663-2aea-4ac0-98bc-eb817aee0f98" path="/var/lib/kubelet/pods/913d2663-2aea-4ac0-98bc-eb817aee0f98/volumes" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.435997 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b84c2e64-bb1c-40ea-a369-55cce87dc7d7" path="/var/lib/kubelet/pods/b84c2e64-bb1c-40ea-a369-55cce87dc7d7/volumes" Jan 30 18:35:18 crc kubenswrapper[4782]: E0130 18:35:18.475579 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod858d4185_257a_4486_8147_63381dd9a8f6.slice/crio-conmon-76f123e35a39e5a33fd16226b506bba2fadc7fad9ebf1bc784424065b9ab2a18.scope\": RecentStats: unable to find data in memory cache]" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.627715 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jg4gn" Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.658314 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t6s5g"] Jan 30 18:35:18 crc kubenswrapper[4782]: W0130 18:35:18.673716 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8269608_e848_472d_a953_8d8b3a3418e2.slice/crio-dd18b4e2f92bf1b85e9cbaa7d3f6147517adc149f2bddd93208f9e6244072c3f WatchSource:0}: Error finding container dd18b4e2f92bf1b85e9cbaa7d3f6147517adc149f2bddd93208f9e6244072c3f: Status 404 returned error can't find the container with id dd18b4e2f92bf1b85e9cbaa7d3f6147517adc149f2bddd93208f9e6244072c3f Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.682932 4782 generic.go:334] "Generic (PLEG): container finished" podID="858d4185-257a-4486-8147-63381dd9a8f6" 
containerID="76f123e35a39e5a33fd16226b506bba2fadc7fad9ebf1bc784424065b9ab2a18" exitCode=0 Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.683569 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvlds" event={"ID":"858d4185-257a-4486-8147-63381dd9a8f6","Type":"ContainerDied","Data":"76f123e35a39e5a33fd16226b506bba2fadc7fad9ebf1bc784424065b9ab2a18"} Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.683632 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvlds" event={"ID":"858d4185-257a-4486-8147-63381dd9a8f6","Type":"ContainerStarted","Data":"8d6a2a510a4bf7ce3f721ee1aa907dc0b6dbad0353a2a6f9ce59981fac8ee876"} Jan 30 18:35:18 crc kubenswrapper[4782]: I0130 18:35:18.694879 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qjchv"] Jan 30 18:35:19 crc kubenswrapper[4782]: I0130 18:35:19.691812 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvlds" event={"ID":"858d4185-257a-4486-8147-63381dd9a8f6","Type":"ContainerStarted","Data":"3dfe955c220b6ffe68e9ee2dc7a669f013e92bc6619deb07c993783b3d7298d4"} Jan 30 18:35:19 crc kubenswrapper[4782]: I0130 18:35:19.695956 4782 generic.go:334] "Generic (PLEG): container finished" podID="b8269608-e848-472d-a953-8d8b3a3418e2" containerID="8f0f77134d3b304eba8365c5c75ead9cf3553f5d179e943e877246e9b13f6d7f" exitCode=0 Jan 30 18:35:19 crc kubenswrapper[4782]: I0130 18:35:19.695997 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6s5g" event={"ID":"b8269608-e848-472d-a953-8d8b3a3418e2","Type":"ContainerDied","Data":"8f0f77134d3b304eba8365c5c75ead9cf3553f5d179e943e877246e9b13f6d7f"} Jan 30 18:35:19 crc kubenswrapper[4782]: I0130 18:35:19.696022 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6s5g" event={"ID":"b8269608-e848-472d-a953-8d8b3a3418e2","Type":"ContainerStarted","Data":"dd18b4e2f92bf1b85e9cbaa7d3f6147517adc149f2bddd93208f9e6244072c3f"} Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.129484 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f7ztk"] Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.130784 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.136309 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.145908 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7ztk"] Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.177306 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6093f28-21a4-43ed-873f-4be71c22abfe-utilities\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.177363 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h85rc\" (UniqueName: \"kubernetes.io/projected/a6093f28-21a4-43ed-873f-4be71c22abfe-kube-api-access-h85rc\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.177667 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6093f28-21a4-43ed-873f-4be71c22abfe-catalog-content\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.280129 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6093f28-21a4-43ed-873f-4be71c22abfe-catalog-content\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.280296 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6093f28-21a4-43ed-873f-4be71c22abfe-utilities\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.280346 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h85rc\" (UniqueName: \"kubernetes.io/projected/a6093f28-21a4-43ed-873f-4be71c22abfe-kube-api-access-h85rc\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.280683 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6093f28-21a4-43ed-873f-4be71c22abfe-catalog-content\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.280968 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6093f28-21a4-43ed-873f-4be71c22abfe-utilities\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " 
pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.308430 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h85rc\" (UniqueName: \"kubernetes.io/projected/a6093f28-21a4-43ed-873f-4be71c22abfe-kube-api-access-h85rc\") pod \"redhat-operators-f7ztk\" (UID: \"a6093f28-21a4-43ed-873f-4be71c22abfe\") " pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.334639 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jklvb"] Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.335558 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.341467 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jklvb"] Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.342896 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.381445 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c42a5-1f55-4d42-8696-d59384fa426f-catalog-content\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.381624 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c42a5-1f55-4d42-8696-d59384fa426f-utilities\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.381691 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25mql\" (UniqueName: \"kubernetes.io/projected/8a2c42a5-1f55-4d42-8696-d59384fa426f-kube-api-access-25mql\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.448626 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.482989 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c42a5-1f55-4d42-8696-d59384fa426f-utilities\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.483203 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25mql\" (UniqueName: \"kubernetes.io/projected/8a2c42a5-1f55-4d42-8696-d59384fa426f-kube-api-access-25mql\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.483278 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c42a5-1f55-4d42-8696-d59384fa426f-catalog-content\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.483470 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a2c42a5-1f55-4d42-8696-d59384fa426f-utilities\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.484146 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a2c42a5-1f55-4d42-8696-d59384fa426f-catalog-content\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.500431 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25mql\" (UniqueName: \"kubernetes.io/projected/8a2c42a5-1f55-4d42-8696-d59384fa426f-kube-api-access-25mql\") pod \"community-operators-jklvb\" (UID: \"8a2c42a5-1f55-4d42-8696-d59384fa426f\") " pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.650443 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7ztk"] Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.687773 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.702518 4782 generic.go:334] "Generic (PLEG): container finished" podID="b8269608-e848-472d-a953-8d8b3a3418e2" containerID="74ef15eb01ff2fa0d6f6059a3b75a9457d6e4b1348432876080d63da1a1de4bd" exitCode=0 Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.702592 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6s5g" event={"ID":"b8269608-e848-472d-a953-8d8b3a3418e2","Type":"ContainerDied","Data":"74ef15eb01ff2fa0d6f6059a3b75a9457d6e4b1348432876080d63da1a1de4bd"} Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.708073 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7ztk" event={"ID":"a6093f28-21a4-43ed-873f-4be71c22abfe","Type":"ContainerStarted","Data":"8ceb518a48db4aadaa7fb0e21f0136b26cf487dccca00fad234d63aad3345d68"} Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.712159 4782 generic.go:334] "Generic (PLEG): container finished" podID="858d4185-257a-4486-8147-63381dd9a8f6" containerID="3dfe955c220b6ffe68e9ee2dc7a669f013e92bc6619deb07c993783b3d7298d4" exitCode=0 Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.712597 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvlds" event={"ID":"858d4185-257a-4486-8147-63381dd9a8f6","Type":"ContainerDied","Data":"3dfe955c220b6ffe68e9ee2dc7a669f013e92bc6619deb07c993783b3d7298d4"} Jan 30 18:35:20 crc kubenswrapper[4782]: I0130 18:35:20.906758 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jklvb"] Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.722506 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t6s5g" event={"ID":"b8269608-e848-472d-a953-8d8b3a3418e2","Type":"ContainerStarted","Data":"b3127cfa59dbc1df89fdce063fab5e4272b69151a169b7ce9e729ff9b1c644ed"} Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.724387 4782 generic.go:334] "Generic (PLEG): container finished" podID="a6093f28-21a4-43ed-873f-4be71c22abfe" containerID="c4d50b5b575f2f96d83712d5ae04d662d83519326820f585b3da3280f380f7d2" exitCode=0 Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.724445 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7ztk" event={"ID":"a6093f28-21a4-43ed-873f-4be71c22abfe","Type":"ContainerDied","Data":"c4d50b5b575f2f96d83712d5ae04d662d83519326820f585b3da3280f380f7d2"} Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.725858 4782 generic.go:334] "Generic (PLEG): container finished" podID="8a2c42a5-1f55-4d42-8696-d59384fa426f" containerID="f2621193f4c0f40c9f8abcb74b0d8619f2637bd16e903f3f1d988ad34148ccd6" exitCode=0 Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.725891 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jklvb" event={"ID":"8a2c42a5-1f55-4d42-8696-d59384fa426f","Type":"ContainerDied","Data":"f2621193f4c0f40c9f8abcb74b0d8619f2637bd16e903f3f1d988ad34148ccd6"} Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.725943 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jklvb" event={"ID":"8a2c42a5-1f55-4d42-8696-d59384fa426f","Type":"ContainerStarted","Data":"005838a5bace47f161c92151776e70f71ef7b338eae786e268067da574ba8756"} Jan 30 18:35:21 crc 
kubenswrapper[4782]: I0130 18:35:21.729324 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvlds" event={"ID":"858d4185-257a-4486-8147-63381dd9a8f6","Type":"ContainerStarted","Data":"ca909c71c4b245468e54fc1753ff548d5453493a581aeba169f202e21176fef8"} Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.745708 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t6s5g" podStartSLOduration=3.3469303200000002 podStartE2EDuration="4.745684352s" podCreationTimestamp="2026-01-30 18:35:17 +0000 UTC" firstStartedPulling="2026-01-30 18:35:19.697312525 +0000 UTC m=+295.965690550" lastFinishedPulling="2026-01-30 18:35:21.096066557 +0000 UTC m=+297.364444582" observedRunningTime="2026-01-30 18:35:21.741943027 +0000 UTC m=+298.010321062" watchObservedRunningTime="2026-01-30 18:35:21.745684352 +0000 UTC m=+298.014062377" Jan 30 18:35:21 crc kubenswrapper[4782]: I0130 18:35:21.801809 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jvlds" podStartSLOduration=2.171036681 podStartE2EDuration="4.801788607s" podCreationTimestamp="2026-01-30 18:35:17 +0000 UTC" firstStartedPulling="2026-01-30 18:35:18.705552735 +0000 UTC m=+294.973930760" lastFinishedPulling="2026-01-30 18:35:21.336304661 +0000 UTC m=+297.604682686" observedRunningTime="2026-01-30 18:35:21.781873481 +0000 UTC m=+298.050251516" watchObservedRunningTime="2026-01-30 18:35:21.801788607 +0000 UTC m=+298.070166632" Jan 30 18:35:23 crc kubenswrapper[4782]: I0130 18:35:23.745994 4782 generic.go:334] "Generic (PLEG): container finished" podID="a6093f28-21a4-43ed-873f-4be71c22abfe" containerID="7feb81a38229bb28962be0af6b3939c388d55fe0cde6a61e9525a9cc695c5803" exitCode=0 Jan 30 18:35:23 crc kubenswrapper[4782]: I0130 18:35:23.746127 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7ztk" event={"ID":"a6093f28-21a4-43ed-873f-4be71c22abfe","Type":"ContainerDied","Data":"7feb81a38229bb28962be0af6b3939c388d55fe0cde6a61e9525a9cc695c5803"} Jan 30 18:35:23 crc kubenswrapper[4782]: I0130 18:35:23.753521 4782 generic.go:334] "Generic (PLEG): container finished" podID="8a2c42a5-1f55-4d42-8696-d59384fa426f" containerID="785da29ef656d51a1886c05c5af763aaf156124e9b008c4ee29dccff693fa992" exitCode=0 Jan 30 18:35:23 crc kubenswrapper[4782]: I0130 18:35:23.753563 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jklvb" event={"ID":"8a2c42a5-1f55-4d42-8696-d59384fa426f","Type":"ContainerDied","Data":"785da29ef656d51a1886c05c5af763aaf156124e9b008c4ee29dccff693fa992"} Jan 30 18:35:24 crc kubenswrapper[4782]: I0130 18:35:24.190333 4782 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 30 18:35:24 crc kubenswrapper[4782]: I0130 18:35:24.759186 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7ztk" event={"ID":"a6093f28-21a4-43ed-873f-4be71c22abfe","Type":"ContainerStarted","Data":"5b52806ca82b80877d6f97d142afa347c7b36239ca6bf4fbf901db52075c568b"} Jan 30 18:35:24 crc kubenswrapper[4782]: I0130 18:35:24.761415 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jklvb" event={"ID":"8a2c42a5-1f55-4d42-8696-d59384fa426f","Type":"ContainerStarted","Data":"656d986a9ae3cfbf8b7317d18d666a7486b4023061e0fa0694f3871e430e21fb"} 
Jan 30 18:35:24 crc kubenswrapper[4782]: I0130 18:35:24.782016 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f7ztk" podStartSLOduration=2.11991224 podStartE2EDuration="4.782000962s" podCreationTimestamp="2026-01-30 18:35:20 +0000 UTC" firstStartedPulling="2026-01-30 18:35:21.726053393 +0000 UTC m=+297.994431418" lastFinishedPulling="2026-01-30 18:35:24.388142115 +0000 UTC m=+300.656520140" observedRunningTime="2026-01-30 18:35:24.776853751 +0000 UTC m=+301.045231776" watchObservedRunningTime="2026-01-30 18:35:24.782000962 +0000 UTC m=+301.050378987" Jan 30 18:35:24 crc kubenswrapper[4782]: I0130 18:35:24.794704 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jklvb" podStartSLOduration=2.206204073 podStartE2EDuration="4.794682724s" podCreationTimestamp="2026-01-30 18:35:20 +0000 UTC" firstStartedPulling="2026-01-30 18:35:21.726832513 +0000 UTC m=+297.995210538" lastFinishedPulling="2026-01-30 18:35:24.315311164 +0000 UTC m=+300.583689189" observedRunningTime="2026-01-30 18:35:24.793623707 +0000 UTC m=+301.062001732" watchObservedRunningTime="2026-01-30 18:35:24.794682724 +0000 UTC m=+301.063060749" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.063670 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.064421 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.131265 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.255735 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.256501 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.304644 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.827994 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t6s5g" Jan 30 18:35:28 crc kubenswrapper[4782]: I0130 18:35:28.845914 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jvlds" Jan 30 18:35:30 crc kubenswrapper[4782]: I0130 18:35:30.448988 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:30 crc kubenswrapper[4782]: I0130 18:35:30.450208 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:30 crc kubenswrapper[4782]: I0130 18:35:30.688214 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:30 crc kubenswrapper[4782]: I0130 18:35:30.688610 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:30 crc 
kubenswrapper[4782]: I0130 18:35:30.737042 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:30 crc kubenswrapper[4782]: I0130 18:35:30.840276 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jklvb" Jan 30 18:35:31 crc kubenswrapper[4782]: I0130 18:35:31.499358 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7ztk" podUID="a6093f28-21a4-43ed-873f-4be71c22abfe" containerName="registry-server" probeResult="failure" output=< Jan 30 18:35:31 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 18:35:31 crc kubenswrapper[4782]: > Jan 30 18:35:40 crc kubenswrapper[4782]: I0130 18:35:40.495563 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:40 crc kubenswrapper[4782]: I0130 18:35:40.555514 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f7ztk" Jan 30 18:35:43 crc kubenswrapper[4782]: I0130 18:35:43.766096 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" podUID="34770880-dc82-40ff-9989-bbe06f230233" containerName="registry" containerID="cri-o://c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d" gracePeriod=30 Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.224393 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.403591 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-trusted-ca\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.403670 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5tbf\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-kube-api-access-v5tbf\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.403722 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-bound-sa-token\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.403763 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34770880-dc82-40ff-9989-bbe06f230233-installation-pull-secrets\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.404351 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 
30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.405484 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-registry-tls\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.405166 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.405600 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-registry-certificates\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.405847 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34770880-dc82-40ff-9989-bbe06f230233-ca-trust-extracted\") pod \"34770880-dc82-40ff-9989-bbe06f230233\" (UID: \"34770880-dc82-40ff-9989-bbe06f230233\") " Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.407101 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.408481 4782 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.408532 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34770880-dc82-40ff-9989-bbe06f230233-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.414307 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-kube-api-access-v5tbf" (OuterVolumeSpecName: "kube-api-access-v5tbf") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "kube-api-access-v5tbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.414639 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34770880-dc82-40ff-9989-bbe06f230233-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.414970 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.415412 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.420466 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.441382 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34770880-dc82-40ff-9989-bbe06f230233-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "34770880-dc82-40ff-9989-bbe06f230233" (UID: "34770880-dc82-40ff-9989-bbe06f230233"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.509928 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5tbf\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-kube-api-access-v5tbf\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.509988 4782 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.510011 4782 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34770880-dc82-40ff-9989-bbe06f230233-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.510032 4782 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34770880-dc82-40ff-9989-bbe06f230233-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.510052 4782 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34770880-dc82-40ff-9989-bbe06f230233-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.877984 4782 generic.go:334] "Generic (PLEG): container finished" podID="34770880-dc82-40ff-9989-bbe06f230233" containerID="c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d" exitCode=0 Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.878057 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" event={"ID":"34770880-dc82-40ff-9989-bbe06f230233","Type":"ContainerDied","Data":"c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d"} Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.878136 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" event={"ID":"34770880-dc82-40ff-9989-bbe06f230233","Type":"ContainerDied","Data":"86499724aff3426fb2deab2c7c2255fc236a71684f1bd9f0f9a7d189195f231e"} Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.878110 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qjchv" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.878202 4782 scope.go:117] "RemoveContainer" containerID="c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.906037 4782 scope.go:117] "RemoveContainer" containerID="c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d" Jan 30 18:35:44 crc kubenswrapper[4782]: E0130 18:35:44.906668 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d\": container with ID starting with c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d not found: ID does not exist" containerID="c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.906734 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d"} err="failed to get container status \"c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d\": rpc error: code = NotFound desc = could not find container \"c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d\": container with ID starting with c926d0d2011b56e6066814e7c81aaf7fc9c24a3d834a8990d01db0a34c9cbf4d not found: ID does not exist" Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.931455 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qjchv"] Jan 30 18:35:44 crc kubenswrapper[4782]: I0130 18:35:44.939254 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qjchv"] Jan 30 18:35:46 crc kubenswrapper[4782]: I0130 18:35:46.423906 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34770880-dc82-40ff-9989-bbe06f230233" path="/var/lib/kubelet/pods/34770880-dc82-40ff-9989-bbe06f230233/volumes" Jan 30 18:36:19 crc kubenswrapper[4782]: I0130 18:36:19.792868 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:36:19 crc kubenswrapper[4782]: I0130 18:36:19.793594 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:36:49 crc kubenswrapper[4782]: I0130 18:36:49.793181 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:36:49 crc kubenswrapper[4782]: I0130 18:36:49.794035 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:37:19 crc kubenswrapper[4782]: I0130 18:37:19.793134 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:37:19 crc kubenswrapper[4782]: I0130 18:37:19.793941 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:37:19 crc kubenswrapper[4782]: I0130 18:37:19.794036 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:37:19 crc kubenswrapper[4782]: I0130 18:37:19.794860 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58c67a71506ebafa777e7bddef3e26a0ebec68fccc8ce52f841113c827923688"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:37:19 crc kubenswrapper[4782]: I0130 18:37:19.794944 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://58c67a71506ebafa777e7bddef3e26a0ebec68fccc8ce52f841113c827923688" gracePeriod=600 Jan 30 18:37:20 crc kubenswrapper[4782]: I0130 18:37:20.539843 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="58c67a71506ebafa777e7bddef3e26a0ebec68fccc8ce52f841113c827923688" exitCode=0 Jan 30 18:37:20 crc kubenswrapper[4782]: I0130 18:37:20.539970 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"58c67a71506ebafa777e7bddef3e26a0ebec68fccc8ce52f841113c827923688"} Jan 30 18:37:20 crc kubenswrapper[4782]: I0130 18:37:20.540379 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"2fa84040f4bd5b0e4284745a145cd040aa730789685e2959938e53c4ffb71cd3"} Jan 30 18:37:20 crc kubenswrapper[4782]: I0130 18:37:20.540423 4782 scope.go:117] "RemoveContainer" containerID="b2a0d39c1e6147d8ef68aae5bf24b942a57ebd4574f48fabe6e7064cbfa21267" Jan 30 18:37:24 crc kubenswrapper[4782]: I0130 18:37:24.806071 4782 scope.go:117] "RemoveContainer" containerID="08d773ca39566ae2c56c91e635802de7c7ddc86cd75bda0d6cb6fc8d4b81df12" Jan 30 18:39:49 crc kubenswrapper[4782]: I0130 18:39:49.792997 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:39:49 crc kubenswrapper[4782]: 
I0130 18:39:49.793791 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.531728 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf"] Jan 30 18:40:05 crc kubenswrapper[4782]: E0130 18:40:05.532413 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34770880-dc82-40ff-9989-bbe06f230233" containerName="registry" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.532426 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="34770880-dc82-40ff-9989-bbe06f230233" containerName="registry" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.532538 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="34770880-dc82-40ff-9989-bbe06f230233" containerName="registry" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.533038 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.535766 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.537646 4782 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vsn4r" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.537856 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.541009 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-wspn9"] Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.541873 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-wspn9" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.551440 4782 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2b8d6" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.555054 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf"] Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.561145 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-wspn9"] Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.571353 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s26rq"] Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.571964 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.575937 4782 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-64xjq" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.581870 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s26rq"] Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.636757 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvx84\" (UniqueName: \"kubernetes.io/projected/f8175d03-4ab5-4ed7-ab43-c722ef6a33b3-kube-api-access-nvx84\") pod \"cert-manager-webhook-687f57d79b-s26rq\" (UID: \"f8175d03-4ab5-4ed7-ab43-c722ef6a33b3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.636910 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h42qs\" (UniqueName: \"kubernetes.io/projected/afb307de-3731-434f-bbf7-3f8fcd8cd336-kube-api-access-h42qs\") pod \"cert-manager-858654f9db-wspn9\" (UID: \"afb307de-3731-434f-bbf7-3f8fcd8cd336\") " pod="cert-manager/cert-manager-858654f9db-wspn9" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.637090 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rcqj\" (UniqueName: \"kubernetes.io/projected/bd00add1-aab0-4229-837f-7f79d71ad160-kube-api-access-6rcqj\") pod \"cert-manager-cainjector-cf98fcc89-8hdgf\" (UID: \"bd00add1-aab0-4229-837f-7f79d71ad160\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.738808 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rcqj\" (UniqueName: \"kubernetes.io/projected/bd00add1-aab0-4229-837f-7f79d71ad160-kube-api-access-6rcqj\") pod \"cert-manager-cainjector-cf98fcc89-8hdgf\" (UID: \"bd00add1-aab0-4229-837f-7f79d71ad160\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.738877 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvx84\" (UniqueName: \"kubernetes.io/projected/f8175d03-4ab5-4ed7-ab43-c722ef6a33b3-kube-api-access-nvx84\") pod \"cert-manager-webhook-687f57d79b-s26rq\" (UID: \"f8175d03-4ab5-4ed7-ab43-c722ef6a33b3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.738945 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h42qs\" (UniqueName: \"kubernetes.io/projected/afb307de-3731-434f-bbf7-3f8fcd8cd336-kube-api-access-h42qs\") pod \"cert-manager-858654f9db-wspn9\" (UID: \"afb307de-3731-434f-bbf7-3f8fcd8cd336\") " pod="cert-manager/cert-manager-858654f9db-wspn9" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.768830 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvx84\" (UniqueName: \"kubernetes.io/projected/f8175d03-4ab5-4ed7-ab43-c722ef6a33b3-kube-api-access-nvx84\") pod \"cert-manager-webhook-687f57d79b-s26rq\" (UID: \"f8175d03-4ab5-4ed7-ab43-c722ef6a33b3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.768983 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6rcqj\" (UniqueName: \"kubernetes.io/projected/bd00add1-aab0-4229-837f-7f79d71ad160-kube-api-access-6rcqj\") pod \"cert-manager-cainjector-cf98fcc89-8hdgf\" (UID: \"bd00add1-aab0-4229-837f-7f79d71ad160\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.769821 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h42qs\" (UniqueName: \"kubernetes.io/projected/afb307de-3731-434f-bbf7-3f8fcd8cd336-kube-api-access-h42qs\") pod \"cert-manager-858654f9db-wspn9\" (UID: \"afb307de-3731-434f-bbf7-3f8fcd8cd336\") " pod="cert-manager/cert-manager-858654f9db-wspn9" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.856682 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.866163 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-wspn9" Jan 30 18:40:05 crc kubenswrapper[4782]: I0130 18:40:05.884022 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" Jan 30 18:40:06 crc kubenswrapper[4782]: I0130 18:40:06.135480 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s26rq"] Jan 30 18:40:06 crc kubenswrapper[4782]: I0130 18:40:06.149119 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 18:40:06 crc kubenswrapper[4782]: I0130 18:40:06.312835 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf"] Jan 30 18:40:06 crc kubenswrapper[4782]: W0130 18:40:06.316075 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb307de_3731_434f_bbf7_3f8fcd8cd336.slice/crio-b5091edb594fc31d06b8ec2b715633dc5a1fac1604cfcab089844dfb2294b668 WatchSource:0}: Error finding container b5091edb594fc31d06b8ec2b715633dc5a1fac1604cfcab089844dfb2294b668: Status 404 returned error can't find the container with id b5091edb594fc31d06b8ec2b715633dc5a1fac1604cfcab089844dfb2294b668 Jan 30 18:40:06 crc kubenswrapper[4782]: I0130 18:40:06.320831 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-wspn9"] Jan 30 18:40:06 crc kubenswrapper[4782]: I0130 18:40:06.603167 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-wspn9" event={"ID":"afb307de-3731-434f-bbf7-3f8fcd8cd336","Type":"ContainerStarted","Data":"b5091edb594fc31d06b8ec2b715633dc5a1fac1604cfcab089844dfb2294b668"} Jan 30 18:40:06 crc kubenswrapper[4782]: I0130 18:40:06.604467 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" event={"ID":"bd00add1-aab0-4229-837f-7f79d71ad160","Type":"ContainerStarted","Data":"268dbddc11a7b4019f581f9173670c5a804ebd3261db381e3ab52cac42214e04"} Jan 30 18:40:06 crc kubenswrapper[4782]: I0130 18:40:06.605619 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" event={"ID":"f8175d03-4ab5-4ed7-ab43-c722ef6a33b3","Type":"ContainerStarted","Data":"36b01b0fdc394ef5569bfdf3bb5b751c74f697402c17843f9adbf2363dae6131"} Jan 30 18:40:10 crc kubenswrapper[4782]: I0130 18:40:10.636210 
4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" event={"ID":"bd00add1-aab0-4229-837f-7f79d71ad160","Type":"ContainerStarted","Data":"9b58d552d247c6020e9b89e769d84ab442110ddf60020105cd27604da1d2d23e"} Jan 30 18:40:10 crc kubenswrapper[4782]: I0130 18:40:10.638784 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" event={"ID":"f8175d03-4ab5-4ed7-ab43-c722ef6a33b3","Type":"ContainerStarted","Data":"838513b8dbee634518b494e258f3d72a251f62df5099e3671f66c3d7c7e2c9ba"} Jan 30 18:40:10 crc kubenswrapper[4782]: I0130 18:40:10.638887 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" Jan 30 18:40:10 crc kubenswrapper[4782]: I0130 18:40:10.640509 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-wspn9" event={"ID":"afb307de-3731-434f-bbf7-3f8fcd8cd336","Type":"ContainerStarted","Data":"a1f14b4ecf19833190904f1e1dc59772501a66f5a9d8a5d4ed4423727e69b865"} Jan 30 18:40:10 crc kubenswrapper[4782]: I0130 18:40:10.699041 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8hdgf" podStartSLOduration=1.912459185 podStartE2EDuration="5.699013899s" podCreationTimestamp="2026-01-30 18:40:05 +0000 UTC" firstStartedPulling="2026-01-30 18:40:06.305907241 +0000 UTC m=+582.574285266" lastFinishedPulling="2026-01-30 18:40:10.092461935 +0000 UTC m=+586.360839980" observedRunningTime="2026-01-30 18:40:10.653621229 +0000 UTC m=+586.921999254" watchObservedRunningTime="2026-01-30 18:40:10.699013899 +0000 UTC m=+586.967391964" Jan 30 18:40:10 crc kubenswrapper[4782]: I0130 18:40:10.699476 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-wspn9" podStartSLOduration=2.031593489 podStartE2EDuration="5.699462s" podCreationTimestamp="2026-01-30 18:40:05 +0000 UTC" firstStartedPulling="2026-01-30 18:40:06.321428695 +0000 UTC m=+582.589806710" lastFinishedPulling="2026-01-30 18:40:09.989297196 +0000 UTC m=+586.257675221" observedRunningTime="2026-01-30 18:40:10.696852334 +0000 UTC m=+586.965230399" watchObservedRunningTime="2026-01-30 18:40:10.699462 +0000 UTC m=+586.967840095" Jan 30 18:40:10 crc kubenswrapper[4782]: I0130 18:40:10.728666 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" podStartSLOduration=1.888199425 podStartE2EDuration="5.728645962s" podCreationTimestamp="2026-01-30 18:40:05 +0000 UTC" firstStartedPulling="2026-01-30 18:40:06.14889974 +0000 UTC m=+582.417277765" lastFinishedPulling="2026-01-30 18:40:09.989346237 +0000 UTC m=+586.257724302" observedRunningTime="2026-01-30 18:40:10.72217359 +0000 UTC m=+586.990551695" watchObservedRunningTime="2026-01-30 18:40:10.728645962 +0000 UTC m=+586.997024007" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.253083 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lxk6x"] Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.256686 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-controller" containerID="cri-o://253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" gracePeriod=30 Jan 30 18:40:15 crc 
kubenswrapper[4782]: I0130 18:40:15.257064 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="sbdb" containerID="cri-o://5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" gracePeriod=30 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.257110 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="nbdb" containerID="cri-o://c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" gracePeriod=30 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.257139 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="northd" containerID="cri-o://0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" gracePeriod=30 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.257203 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" gracePeriod=30 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.257251 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-node" containerID="cri-o://47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" gracePeriod=30 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.257281 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-acl-logging" containerID="cri-o://084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" gracePeriod=30 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.296867 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovnkube-controller" containerID="cri-o://7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" gracePeriod=30 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.532220 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxk6x_dd1fb9ae-9c56-4d08-b0ef-c661158367ce/ovn-acl-logging/0.log" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.533243 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxk6x_dd1fb9ae-9c56-4d08-b0ef-c661158367ce/ovn-controller/0.log" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.533730 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572469 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-node-log\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572513 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-var-lib-openvswitch\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572538 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-script-lib\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572556 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-systemd-units\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572580 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-config\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572594 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-systemd\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572591 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-node-log" (OuterVolumeSpecName: "node-log") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572606 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-bin\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572632 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572654 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-netd\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572679 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-log-socket\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572684 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572730 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572706 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-openvswitch\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572769 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-slash\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572812 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-env-overrides\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572839 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-netns\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572862 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-ovn\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572889 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovn-node-metrics-cert\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572914 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572949 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-kubelet\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572981 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4r8x\" (UniqueName: \"kubernetes.io/projected/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-kube-api-access-x4r8x\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573026 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-etc-openvswitch\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572710 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573040 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572769 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573058 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-ovn-kubernetes\") pod \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\" (UID: \"dd1fb9ae-9c56-4d08-b0ef-c661158367ce\") " Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.572798 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-log-socket" (OuterVolumeSpecName: "log-socket") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573013 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573038 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-slash" (OuterVolumeSpecName: "host-slash") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573075 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573209 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573081 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573131 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573158 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573453 4782 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573479 4782 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573497 4782 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573515 4782 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573532 4782 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573550 4782 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573568 4782 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-node-log\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573583 4782 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573598 4782 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573613 4782 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573628 4782 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 
18:40:15.573644 4782 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573661 4782 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-log-socket\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573676 4782 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573687 4782 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-host-slash\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573803 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.573827 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.580782 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.581780 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-kube-api-access-x4r8x" (OuterVolumeSpecName: "kube-api-access-x4r8x") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "kube-api-access-x4r8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.592061 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "dd1fb9ae-9c56-4d08-b0ef-c661158367ce" (UID: "dd1fb9ae-9c56-4d08-b0ef-c661158367ce"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.597743 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dx9s9"] Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.598181 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.598327 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.598429 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovnkube-controller" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.598513 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovnkube-controller" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.598595 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="sbdb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.598662 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="sbdb" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.598736 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-node" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.598804 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-node" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.598872 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-acl-logging" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.598934 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-acl-logging" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.599004 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="northd" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599074 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="northd" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.599140 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-controller" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599207 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-controller" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.599299 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="nbdb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599374 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="nbdb" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.599438 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" 
containerName="kubecfg-setup" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599503 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kubecfg-setup" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599677 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-node" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599779 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-acl-logging" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599854 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="nbdb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599927 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.599992 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="northd" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.600058 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovn-controller" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.600135 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="ovnkube-controller" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.600204 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerName="sbdb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.602718 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674018 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-run-netns\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674070 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-log-socket\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674097 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovnkube-script-lib\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674124 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-systemd\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674145 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-ovn\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674166 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-var-lib-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674188 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674214 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-cni-bin\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674266 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovnkube-config\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674477 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-run-ovn-kubernetes\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674566 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-etc-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674654 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-slash\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674779 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-env-overrides\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674828 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-systemd-units\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674884 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674925 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovn-node-metrics-cert\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.674980 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-cni-netd\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675039 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-node-log\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675091 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw2qk\" (UniqueName: \"kubernetes.io/projected/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-kube-api-access-pw2qk\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675153 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-kubelet\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675311 4782 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675345 4782 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675378 4782 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675406 4782 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675438 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4r8x\" (UniqueName: \"kubernetes.io/projected/dd1fb9ae-9c56-4d08-b0ef-c661158367ce-kube-api-access-x4r8x\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675476 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qdgpq_c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6/kube-multus/0.log" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675554 4782 generic.go:334] "Generic (PLEG): container finished" podID="c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6" containerID="7190924486a58947a7a39b8e6ae7a95007953ffcb2ccc40f7d61e2bf38e80b2c" exitCode=2 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.675657 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qdgpq" event={"ID":"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6","Type":"ContainerDied","Data":"7190924486a58947a7a39b8e6ae7a95007953ffcb2ccc40f7d61e2bf38e80b2c"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.676481 4782 scope.go:117] "RemoveContainer" containerID="7190924486a58947a7a39b8e6ae7a95007953ffcb2ccc40f7d61e2bf38e80b2c" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.683823 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxk6x_dd1fb9ae-9c56-4d08-b0ef-c661158367ce/ovn-acl-logging/0.log" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.684807 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lxk6x_dd1fb9ae-9c56-4d08-b0ef-c661158367ce/ovn-controller/0.log" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685378 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" exitCode=0 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685418 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" exitCode=0 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685436 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" exitCode=0 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685453 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" exitCode=0 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685471 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" exitCode=0 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685492 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" exitCode=0 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685512 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" exitCode=143 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685531 4782 generic.go:334] "Generic (PLEG): container finished" podID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" containerID="253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" exitCode=143 Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685566 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685611 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685639 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685666 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" 
event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685693 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685719 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685746 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685767 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685783 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685803 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685823 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685840 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685855 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685871 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685887 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685903 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685918 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685932 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685946 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685966 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.685989 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686007 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686022 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686037 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686054 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686068 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686083 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686098 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686112 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686131 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" event={"ID":"dd1fb9ae-9c56-4d08-b0ef-c661158367ce","Type":"ContainerDied","Data":"3b5b137ae156ad5ef5ed0bf5b934def26c5189163af9d7b46198d06ebd991cda"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686154 4782 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686170 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686184 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686197 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686212 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686261 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686277 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686292 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686306 4782 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686333 4782 scope.go:117] "RemoveContainer" containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.686609 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lxk6x" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.719168 4782 scope.go:117] "RemoveContainer" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.747444 4782 scope.go:117] "RemoveContainer" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776599 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-systemd\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776667 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-ovn\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776717 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-var-lib-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776742 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776805 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-cni-bin\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776837 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovnkube-config\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776919 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-run-ovn-kubernetes\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.776977 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-etc-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777070 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-slash\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777108 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-systemd-units\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777166 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-env-overrides\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777214 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovn-node-metrics-cert\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777259 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777280 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-cni-netd\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777332 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-node-log\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777379 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw2qk\" (UniqueName: \"kubernetes.io/projected/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-kube-api-access-pw2qk\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777406 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-kubelet\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 
18:40:15.777473 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-run-netns\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777502 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-log-socket\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.777521 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovnkube-script-lib\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778344 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-systemd-units\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778445 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-cni-bin\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778553 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-systemd\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778598 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-ovn\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778637 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-var-lib-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778671 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778728 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovnkube-script-lib\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778816 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-run-netns\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778854 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-log-socket\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778884 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lxk6x"] Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.778931 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-run-ovn-kubernetes\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779102 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovnkube-config\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779155 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-env-overrides\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779160 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-etc-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779189 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-slash\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779262 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-kubelet\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779300 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-host-cni-netd\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779339 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-run-openvswitch\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.779369 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-node-log\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.780982 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lxk6x"] Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.787891 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-ovn-node-metrics-cert\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.791043 4782 scope.go:117] "RemoveContainer" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.803743 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw2qk\" (UniqueName: \"kubernetes.io/projected/1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33-kube-api-access-pw2qk\") pod \"ovnkube-node-dx9s9\" (UID: \"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33\") " pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.810830 4782 scope.go:117] "RemoveContainer" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.825284 4782 scope.go:117] "RemoveContainer" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.838464 4782 scope.go:117] "RemoveContainer" containerID="084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.852459 4782 scope.go:117] "RemoveContainer" containerID="253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.867726 4782 scope.go:117] "RemoveContainer" containerID="099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.884862 4782 scope.go:117] "RemoveContainer" containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.885269 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": container with ID starting with 7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac not found: 
ID does not exist" containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.885311 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} err="failed to get container status \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": rpc error: code = NotFound desc = could not find container \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": container with ID starting with 7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.885344 4782 scope.go:117] "RemoveContainer" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.885969 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": container with ID starting with 5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39 not found: ID does not exist" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.886072 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} err="failed to get container status \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": rpc error: code = NotFound desc = could not find container \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": container with ID starting with 5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.886107 4782 scope.go:117] "RemoveContainer" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.886967 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.889116 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": container with ID starting with c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5 not found: ID does not exist" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.889172 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} err="failed to get container status \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": rpc error: code = NotFound desc = could not find container \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": container with ID starting with c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.889203 4782 scope.go:117] "RemoveContainer" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" Jan 30 18:40:15 
crc kubenswrapper[4782]: E0130 18:40:15.889630 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": container with ID starting with 0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef not found: ID does not exist" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.889665 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} err="failed to get container status \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": rpc error: code = NotFound desc = could not find container \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": container with ID starting with 0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.889687 4782 scope.go:117] "RemoveContainer" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.890136 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": container with ID starting with 863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e not found: ID does not exist" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.890171 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} err="failed to get container status \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": rpc error: code = NotFound desc = could not find container \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": container with ID starting with 863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.890187 4782 scope.go:117] "RemoveContainer" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.890743 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": container with ID starting with 47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721 not found: ID does not exist" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.890775 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} err="failed to get container status \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": rpc error: code = NotFound desc = could not find container \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": container with ID starting with 47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721 not found: ID does not exist" Jan 30 18:40:15 crc 
kubenswrapper[4782]: I0130 18:40:15.890788 4782 scope.go:117] "RemoveContainer" containerID="084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.891108 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": container with ID starting with 084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221 not found: ID does not exist" containerID="084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.891136 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} err="failed to get container status \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": rpc error: code = NotFound desc = could not find container \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": container with ID starting with 084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.891153 4782 scope.go:117] "RemoveContainer" containerID="253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.891488 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": container with ID starting with 253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb not found: ID does not exist" containerID="253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.891511 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} err="failed to get container status \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": rpc error: code = NotFound desc = could not find container \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": container with ID starting with 253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.891527 4782 scope.go:117] "RemoveContainer" containerID="099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8" Jan 30 18:40:15 crc kubenswrapper[4782]: E0130 18:40:15.891813 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": container with ID starting with 099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8 not found: ID does not exist" containerID="099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.891842 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} err="failed to get container status \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": rpc error: code = NotFound desc = could not find container 
\"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": container with ID starting with 099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.891881 4782 scope.go:117] "RemoveContainer" containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.892263 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} err="failed to get container status \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": rpc error: code = NotFound desc = could not find container \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": container with ID starting with 7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.892289 4782 scope.go:117] "RemoveContainer" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.892595 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} err="failed to get container status \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": rpc error: code = NotFound desc = could not find container \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": container with ID starting with 5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.892614 4782 scope.go:117] "RemoveContainer" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.892892 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} err="failed to get container status \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": rpc error: code = NotFound desc = could not find container \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": container with ID starting with c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.892915 4782 scope.go:117] "RemoveContainer" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893155 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} err="failed to get container status \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": rpc error: code = NotFound desc = could not find container \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": container with ID starting with 0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893172 4782 scope.go:117] "RemoveContainer" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893417 4782 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} err="failed to get container status \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": rpc error: code = NotFound desc = could not find container \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": container with ID starting with 863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893434 4782 scope.go:117] "RemoveContainer" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893647 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} err="failed to get container status \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": rpc error: code = NotFound desc = could not find container \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": container with ID starting with 47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893664 4782 scope.go:117] "RemoveContainer" containerID="084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893915 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} err="failed to get container status \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": rpc error: code = NotFound desc = could not find container \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": container with ID starting with 084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.893933 4782 scope.go:117] "RemoveContainer" containerID="253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894192 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} err="failed to get container status \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": rpc error: code = NotFound desc = could not find container \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": container with ID starting with 253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894208 4782 scope.go:117] "RemoveContainer" containerID="099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894452 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} err="failed to get container status \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": rpc error: code = NotFound desc = could not find container \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": container with ID starting with 
099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894469 4782 scope.go:117] "RemoveContainer" containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894731 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} err="failed to get container status \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": rpc error: code = NotFound desc = could not find container \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": container with ID starting with 7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894749 4782 scope.go:117] "RemoveContainer" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894976 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} err="failed to get container status \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": rpc error: code = NotFound desc = could not find container \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": container with ID starting with 5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.894991 4782 scope.go:117] "RemoveContainer" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.895242 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} err="failed to get container status \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": rpc error: code = NotFound desc = could not find container \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": container with ID starting with c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.895256 4782 scope.go:117] "RemoveContainer" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.895541 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} err="failed to get container status \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": rpc error: code = NotFound desc = could not find container \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": container with ID starting with 0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.895559 4782 scope.go:117] "RemoveContainer" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.895789 4782 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} err="failed to get container status \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": rpc error: code = NotFound desc = could not find container \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": container with ID starting with 863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.895809 4782 scope.go:117] "RemoveContainer" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.896139 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} err="failed to get container status \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": rpc error: code = NotFound desc = could not find container \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": container with ID starting with 47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.896155 4782 scope.go:117] "RemoveContainer" containerID="084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.896398 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} err="failed to get container status \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": rpc error: code = NotFound desc = could not find container \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": container with ID starting with 084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.896417 4782 scope.go:117] "RemoveContainer" containerID="253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.896714 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} err="failed to get container status \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": rpc error: code = NotFound desc = could not find container \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": container with ID starting with 253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.896732 4782 scope.go:117] "RemoveContainer" containerID="099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.897708 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} err="failed to get container status \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": rpc error: code = NotFound desc = could not find container \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": container with ID starting with 099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8 not found: ID does not exist" Jan 
30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.897726 4782 scope.go:117] "RemoveContainer" containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.897917 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} err="failed to get container status \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": rpc error: code = NotFound desc = could not find container \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": container with ID starting with 7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.897935 4782 scope.go:117] "RemoveContainer" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898142 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} err="failed to get container status \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": rpc error: code = NotFound desc = could not find container \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": container with ID starting with 5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898167 4782 scope.go:117] "RemoveContainer" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898388 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} err="failed to get container status \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": rpc error: code = NotFound desc = could not find container \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": container with ID starting with c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898405 4782 scope.go:117] "RemoveContainer" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898602 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} err="failed to get container status \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": rpc error: code = NotFound desc = could not find container \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": container with ID starting with 0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898618 4782 scope.go:117] "RemoveContainer" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898795 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} err="failed to get container status 
\"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": rpc error: code = NotFound desc = could not find container \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": container with ID starting with 863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.898814 4782 scope.go:117] "RemoveContainer" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.899561 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} err="failed to get container status \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": rpc error: code = NotFound desc = could not find container \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": container with ID starting with 47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.899587 4782 scope.go:117] "RemoveContainer" containerID="084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.899825 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221"} err="failed to get container status \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": rpc error: code = NotFound desc = could not find container \"084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221\": container with ID starting with 084cfbfeff46431c4a38aa1ec97494fad106870a90fbe47108801502f6160221 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.899843 4782 scope.go:117] "RemoveContainer" containerID="253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.900160 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb"} err="failed to get container status \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": rpc error: code = NotFound desc = could not find container \"253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb\": container with ID starting with 253eb5b62a092aa6b53788b1355de606862ea7c14f46dd55c8eec6e586836edb not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.900185 4782 scope.go:117] "RemoveContainer" containerID="099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.902391 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8"} err="failed to get container status \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": rpc error: code = NotFound desc = could not find container \"099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8\": container with ID starting with 099aca08579cbc035c6990e4866db4a6ddd76ddb07de4ee5545f9270e39888f8 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.902420 4782 scope.go:117] "RemoveContainer" 
containerID="7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.903156 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac"} err="failed to get container status \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": rpc error: code = NotFound desc = could not find container \"7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac\": container with ID starting with 7e16e4ddf695a52eb48dedeb794cd0fe74604c6d565b7c53d1d9b9073c5346ac not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.903188 4782 scope.go:117] "RemoveContainer" containerID="5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.903553 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39"} err="failed to get container status \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": rpc error: code = NotFound desc = could not find container \"5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39\": container with ID starting with 5f76a049234d9b885c84f6bdda0753eae820c6057281370e4ca578e20c18bd39 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.903578 4782 scope.go:117] "RemoveContainer" containerID="c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.903852 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5"} err="failed to get container status \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": rpc error: code = NotFound desc = could not find container \"c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5\": container with ID starting with c354e34f514cbff4a3bc4d58de2851ea2eb82a97122bc260bf68d835f07c20e5 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.903927 4782 scope.go:117] "RemoveContainer" containerID="0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.904264 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef"} err="failed to get container status \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": rpc error: code = NotFound desc = could not find container \"0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef\": container with ID starting with 0d20d4feeb05d1b381a45f372cdc76f41f8113fba3ce52f6cb055ed2a29335ef not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.904288 4782 scope.go:117] "RemoveContainer" containerID="863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.904797 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e"} err="failed to get container status \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": rpc error: code = NotFound desc = could not find 
container \"863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e\": container with ID starting with 863e4f13fcd73f554374d239b03a1a9ebab75efa0aa8dbb9504d6c7424823b4e not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.904830 4782 scope.go:117] "RemoveContainer" containerID="47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.905343 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721"} err="failed to get container status \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": rpc error: code = NotFound desc = could not find container \"47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721\": container with ID starting with 47e88b74499d5c874443c7eaebf04f7eaa3e8a3cbb3f95ed5a9e699fd85bc721 not found: ID does not exist" Jan 30 18:40:15 crc kubenswrapper[4782]: I0130 18:40:15.930815 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:16 crc kubenswrapper[4782]: I0130 18:40:16.422761 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1fb9ae-9c56-4d08-b0ef-c661158367ce" path="/var/lib/kubelet/pods/dd1fb9ae-9c56-4d08-b0ef-c661158367ce/volumes" Jan 30 18:40:16 crc kubenswrapper[4782]: I0130 18:40:16.697652 4782 generic.go:334] "Generic (PLEG): container finished" podID="1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33" containerID="21df747118c1ee30b44b084d1e9ad2043a7741d3e66f5f30a611983f2b8afd7e" exitCode=0 Jan 30 18:40:16 crc kubenswrapper[4782]: I0130 18:40:16.697745 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerDied","Data":"21df747118c1ee30b44b084d1e9ad2043a7741d3e66f5f30a611983f2b8afd7e"} Jan 30 18:40:16 crc kubenswrapper[4782]: I0130 18:40:16.697803 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"47225adce8f584707e7ff79f682b285659ce52bd4c3b22711bd3788fa0e27960"} Jan 30 18:40:16 crc kubenswrapper[4782]: I0130 18:40:16.701826 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qdgpq_c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6/kube-multus/0.log" Jan 30 18:40:16 crc kubenswrapper[4782]: I0130 18:40:16.701929 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qdgpq" event={"ID":"c3e9fe26-a9ff-4c88-af9a-695c9a46ffe6","Type":"ContainerStarted","Data":"43e8ad7beb07b182b2d48282e93ecd07993e1907c665f606935ef8e723b680fc"} Jan 30 18:40:17 crc kubenswrapper[4782]: I0130 18:40:17.711743 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"9b4f397ccdf7b1f1547e48032eb74a9aaf444e907670d2b98b138f74cd407885"} Jan 30 18:40:17 crc kubenswrapper[4782]: I0130 18:40:17.712062 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"ac7f8a13c39daffb12e0831327b768f4e429e21450ca4a916c0f959371a7885c"} Jan 30 18:40:17 crc kubenswrapper[4782]: I0130 18:40:17.712080 4782 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"b90a51a8424aa56af9e69ad15eec4444db6f667d4d18938accb341f6b951aa08"} Jan 30 18:40:17 crc kubenswrapper[4782]: I0130 18:40:17.712093 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"614830bfdf754f1a86c515e60de14f4b100ac3addb976554c262b16250b22b66"} Jan 30 18:40:17 crc kubenswrapper[4782]: I0130 18:40:17.712104 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"bc875da111e274f3164b57180e7d3e9cf4cb1c79db53569eec18e308ba979954"} Jan 30 18:40:17 crc kubenswrapper[4782]: I0130 18:40:17.712116 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"1ed91165fc204f2b01bf838fe4ae0e3e8b1ce798ddc3606a45a1f7df3586b3d2"} Jan 30 18:40:19 crc kubenswrapper[4782]: I0130 18:40:19.735406 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"34078fba32ed113fcd5a0f1d8efda9ea15eb791d88ca63386be595423168d849"} Jan 30 18:40:19 crc kubenswrapper[4782]: I0130 18:40:19.793143 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:40:19 crc kubenswrapper[4782]: I0130 18:40:19.793251 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:40:22 crc kubenswrapper[4782]: I0130 18:40:22.755092 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" event={"ID":"1e5e0c36-fbd6-4fa6-a0d7-45c0a604ea33","Type":"ContainerStarted","Data":"1aac0b74a259c02779813c9fbd910383ee05c41701aa7ba9999b662af3829e21"} Jan 30 18:40:22 crc kubenswrapper[4782]: I0130 18:40:22.755809 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:22 crc kubenswrapper[4782]: I0130 18:40:22.782492 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:22 crc kubenswrapper[4782]: I0130 18:40:22.812533 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" podStartSLOduration=7.812513708 podStartE2EDuration="7.812513708s" podCreationTimestamp="2026-01-30 18:40:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:40:22.808798225 +0000 UTC m=+599.077176250" watchObservedRunningTime="2026-01-30 18:40:22.812513708 +0000 UTC m=+599.080891733" Jan 30 18:40:23 crc 
kubenswrapper[4782]: I0130 18:40:23.760828 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:23 crc kubenswrapper[4782]: I0130 18:40:23.761210 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:23 crc kubenswrapper[4782]: I0130 18:40:23.795046 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.470028 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm"] Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.473410 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.479320 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.485479 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm"] Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.495743 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.495895 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.495950 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-425w8\" (UniqueName: \"kubernetes.io/projected/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-kube-api-access-425w8\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.597754 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.596992 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-util\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.597896 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-425w8\" (UniqueName: \"kubernetes.io/projected/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-kube-api-access-425w8\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.599422 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.600743 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.631517 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-425w8\" (UniqueName: \"kubernetes.io/projected/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-kube-api-access-425w8\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:43 crc kubenswrapper[4782]: I0130 18:40:43.807995 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:44 crc kubenswrapper[4782]: I0130 18:40:44.130001 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm"] Jan 30 18:40:44 crc kubenswrapper[4782]: W0130 18:40:44.142516 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaebabea_1b8e_4b77_9ea1_7cd8c7270caf.slice/crio-8743c01f589f30ebab0e1ae7397a017197c5d05526b922dd6e44baed621966ef WatchSource:0}: Error finding container 8743c01f589f30ebab0e1ae7397a017197c5d05526b922dd6e44baed621966ef: Status 404 returned error can't find the container with id 8743c01f589f30ebab0e1ae7397a017197c5d05526b922dd6e44baed621966ef Jan 30 18:40:44 crc kubenswrapper[4782]: I0130 18:40:44.981313 4782 generic.go:334] "Generic (PLEG): container finished" podID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerID="3e801d2ec58d29f4be9f5688110d47f4bd1198a6359b726673740a1b0d87b40e" exitCode=0 Jan 30 18:40:44 crc kubenswrapper[4782]: I0130 18:40:44.981470 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" event={"ID":"daebabea-1b8e-4b77-9ea1-7cd8c7270caf","Type":"ContainerDied","Data":"3e801d2ec58d29f4be9f5688110d47f4bd1198a6359b726673740a1b0d87b40e"} Jan 30 18:40:44 crc kubenswrapper[4782]: I0130 18:40:44.982479 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" event={"ID":"daebabea-1b8e-4b77-9ea1-7cd8c7270caf","Type":"ContainerStarted","Data":"8743c01f589f30ebab0e1ae7397a017197c5d05526b922dd6e44baed621966ef"} Jan 30 18:40:45 crc kubenswrapper[4782]: I0130 18:40:45.965041 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dx9s9" Jan 30 18:40:47 crc kubenswrapper[4782]: I0130 18:40:47.001296 4782 generic.go:334] "Generic (PLEG): container finished" podID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerID="55d293ec649246459d2fd2bd9ad8e6d981c17b85dd3bff258434473d2e36af9b" exitCode=0 Jan 30 18:40:47 crc kubenswrapper[4782]: I0130 18:40:47.001399 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" event={"ID":"daebabea-1b8e-4b77-9ea1-7cd8c7270caf","Type":"ContainerDied","Data":"55d293ec649246459d2fd2bd9ad8e6d981c17b85dd3bff258434473d2e36af9b"} Jan 30 18:40:48 crc kubenswrapper[4782]: I0130 18:40:48.031867 4782 generic.go:334] "Generic (PLEG): container finished" podID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerID="237c09ecf4245294d61fd9800581070ed4d314d86813c6e7f06d509a2e5a27ed" exitCode=0 Jan 30 18:40:48 crc kubenswrapper[4782]: I0130 18:40:48.032014 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" event={"ID":"daebabea-1b8e-4b77-9ea1-7cd8c7270caf","Type":"ContainerDied","Data":"237c09ecf4245294d61fd9800581070ed4d314d86813c6e7f06d509a2e5a27ed"} Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.298929 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.385843 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-bundle\") pod \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.386172 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-util\") pod \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.386206 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-425w8\" (UniqueName: \"kubernetes.io/projected/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-kube-api-access-425w8\") pod \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\" (UID: \"daebabea-1b8e-4b77-9ea1-7cd8c7270caf\") " Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.389495 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-bundle" (OuterVolumeSpecName: "bundle") pod "daebabea-1b8e-4b77-9ea1-7cd8c7270caf" (UID: "daebabea-1b8e-4b77-9ea1-7cd8c7270caf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.397474 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-kube-api-access-425w8" (OuterVolumeSpecName: "kube-api-access-425w8") pod "daebabea-1b8e-4b77-9ea1-7cd8c7270caf" (UID: "daebabea-1b8e-4b77-9ea1-7cd8c7270caf"). InnerVolumeSpecName "kube-api-access-425w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.405345 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-util" (OuterVolumeSpecName: "util") pod "daebabea-1b8e-4b77-9ea1-7cd8c7270caf" (UID: "daebabea-1b8e-4b77-9ea1-7cd8c7270caf"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.487998 4782 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.488027 4782 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-util\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.488036 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-425w8\" (UniqueName: \"kubernetes.io/projected/daebabea-1b8e-4b77-9ea1-7cd8c7270caf-kube-api-access-425w8\") on node \"crc\" DevicePath \"\"" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.793524 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.794001 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.794340 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.795990 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fa84040f4bd5b0e4284745a145cd040aa730789685e2959938e53c4ffb71cd3"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:40:49 crc kubenswrapper[4782]: I0130 18:40:49.796282 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://2fa84040f4bd5b0e4284745a145cd040aa730789685e2959938e53c4ffb71cd3" gracePeriod=600 Jan 30 18:40:50 crc kubenswrapper[4782]: I0130 18:40:50.045009 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="2fa84040f4bd5b0e4284745a145cd040aa730789685e2959938e53c4ffb71cd3" exitCode=0 Jan 30 18:40:50 crc kubenswrapper[4782]: I0130 18:40:50.045070 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"2fa84040f4bd5b0e4284745a145cd040aa730789685e2959938e53c4ffb71cd3"} Jan 30 18:40:50 crc kubenswrapper[4782]: I0130 18:40:50.045137 4782 scope.go:117] "RemoveContainer" containerID="58c67a71506ebafa777e7bddef3e26a0ebec68fccc8ce52f841113c827923688" Jan 30 18:40:50 crc kubenswrapper[4782]: I0130 18:40:50.048779 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" event={"ID":"daebabea-1b8e-4b77-9ea1-7cd8c7270caf","Type":"ContainerDied","Data":"8743c01f589f30ebab0e1ae7397a017197c5d05526b922dd6e44baed621966ef"} Jan 30 18:40:50 crc kubenswrapper[4782]: I0130 18:40:50.048836 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8743c01f589f30ebab0e1ae7397a017197c5d05526b922dd6e44baed621966ef" Jan 30 18:40:50 crc kubenswrapper[4782]: I0130 18:40:50.048861 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm" Jan 30 18:40:51 crc kubenswrapper[4782]: I0130 18:40:51.058134 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"78bfa63564c6e38c41b1d267ccc1a5244efd72734f22ff3b1aac2f4b890ae15f"} Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.129140 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq"] Jan 30 18:41:01 crc kubenswrapper[4782]: E0130 18:41:01.130032 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerName="pull" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.130050 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerName="pull" Jan 30 18:41:01 crc kubenswrapper[4782]: E0130 18:41:01.130076 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerName="extract" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.130085 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerName="extract" Jan 30 18:41:01 crc kubenswrapper[4782]: E0130 18:41:01.130098 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerName="util" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.130109 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerName="util" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.130222 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="daebabea-1b8e-4b77-9ea1-7cd8c7270caf" containerName="extract" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.130682 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.134804 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6p9k\" (UniqueName: \"kubernetes.io/projected/92e82803-8b7d-46f3-ba40-2900590261cf-kube-api-access-v6p9k\") pod \"obo-prometheus-operator-68bc856cb9-kmgbq\" (UID: \"92e82803-8b7d-46f3-ba40-2900590261cf\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.136250 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.137045 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-vlzwj" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.137205 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.189701 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.190366 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.192908 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.193022 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-6wsqq" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.233581 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.238154 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05098fbb-e910-4fec-8a31-fd98d476b941-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp\" (UID: \"05098fbb-e910-4fec-8a31-fd98d476b941\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.238342 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6p9k\" (UniqueName: \"kubernetes.io/projected/92e82803-8b7d-46f3-ba40-2900590261cf-kube-api-access-v6p9k\") pod \"obo-prometheus-operator-68bc856cb9-kmgbq\" (UID: \"92e82803-8b7d-46f3-ba40-2900590261cf\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.238372 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05098fbb-e910-4fec-8a31-fd98d476b941-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp\" (UID: \"05098fbb-e910-4fec-8a31-fd98d476b941\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc 
kubenswrapper[4782]: I0130 18:41:01.245222 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.255366 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.256542 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.258675 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.260307 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6p9k\" (UniqueName: \"kubernetes.io/projected/92e82803-8b7d-46f3-ba40-2900590261cf-kube-api-access-v6p9k\") pod \"obo-prometheus-operator-68bc856cb9-kmgbq\" (UID: \"92e82803-8b7d-46f3-ba40-2900590261cf\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.343259 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19b18d8a-aa0f-494e-9e56-55bceba788c6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22\" (UID: \"19b18d8a-aa0f-494e-9e56-55bceba788c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.343336 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05098fbb-e910-4fec-8a31-fd98d476b941-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp\" (UID: \"05098fbb-e910-4fec-8a31-fd98d476b941\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.343411 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19b18d8a-aa0f-494e-9e56-55bceba788c6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22\" (UID: \"19b18d8a-aa0f-494e-9e56-55bceba788c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.343441 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05098fbb-e910-4fec-8a31-fd98d476b941-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp\" (UID: \"05098fbb-e910-4fec-8a31-fd98d476b941\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.346524 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05098fbb-e910-4fec-8a31-fd98d476b941-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp\" (UID: \"05098fbb-e910-4fec-8a31-fd98d476b941\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc 
kubenswrapper[4782]: I0130 18:41:01.363043 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-pcglc"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.363672 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.364396 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05098fbb-e910-4fec-8a31-fd98d476b941-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp\" (UID: \"05098fbb-e910-4fec-8a31-fd98d476b941\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.367019 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.367286 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-zn2vf" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.396182 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-pcglc"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.444612 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19b18d8a-aa0f-494e-9e56-55bceba788c6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22\" (UID: \"19b18d8a-aa0f-494e-9e56-55bceba788c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.444699 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/786ed08c-6b06-4e44-aaf4-5562ef433b88-observability-operator-tls\") pod \"observability-operator-59bdc8b94-pcglc\" (UID: \"786ed08c-6b06-4e44-aaf4-5562ef433b88\") " pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.444726 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7qv\" (UniqueName: \"kubernetes.io/projected/786ed08c-6b06-4e44-aaf4-5562ef433b88-kube-api-access-bn7qv\") pod \"observability-operator-59bdc8b94-pcglc\" (UID: \"786ed08c-6b06-4e44-aaf4-5562ef433b88\") " pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.444768 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19b18d8a-aa0f-494e-9e56-55bceba788c6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22\" (UID: \"19b18d8a-aa0f-494e-9e56-55bceba788c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.449396 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19b18d8a-aa0f-494e-9e56-55bceba788c6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22\" (UID: \"19b18d8a-aa0f-494e-9e56-55bceba788c6\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.449747 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19b18d8a-aa0f-494e-9e56-55bceba788c6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22\" (UID: \"19b18d8a-aa0f-494e-9e56-55bceba788c6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.454352 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.496272 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w89kr"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.497043 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.499114 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qxwgf" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.511007 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.546642 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/616c3ea8-075a-475f-9896-180a02e4cc3f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w89kr\" (UID: \"616c3ea8-075a-475f-9896-180a02e4cc3f\") " pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.546715 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/786ed08c-6b06-4e44-aaf4-5562ef433b88-observability-operator-tls\") pod \"observability-operator-59bdc8b94-pcglc\" (UID: \"786ed08c-6b06-4e44-aaf4-5562ef433b88\") " pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.546741 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7qv\" (UniqueName: \"kubernetes.io/projected/786ed08c-6b06-4e44-aaf4-5562ef433b88-kube-api-access-bn7qv\") pod \"observability-operator-59bdc8b94-pcglc\" (UID: \"786ed08c-6b06-4e44-aaf4-5562ef433b88\") " pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.546781 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcsgt\" (UniqueName: \"kubernetes.io/projected/616c3ea8-075a-475f-9896-180a02e4cc3f-kube-api-access-rcsgt\") pod \"perses-operator-5bf474d74f-w89kr\" (UID: \"616c3ea8-075a-475f-9896-180a02e4cc3f\") " pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.553047 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/786ed08c-6b06-4e44-aaf4-5562ef433b88-observability-operator-tls\") pod 
\"observability-operator-59bdc8b94-pcglc\" (UID: \"786ed08c-6b06-4e44-aaf4-5562ef433b88\") " pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.576370 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w89kr"] Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.593454 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.595895 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7qv\" (UniqueName: \"kubernetes.io/projected/786ed08c-6b06-4e44-aaf4-5562ef433b88-kube-api-access-bn7qv\") pod \"observability-operator-59bdc8b94-pcglc\" (UID: \"786ed08c-6b06-4e44-aaf4-5562ef433b88\") " pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.647909 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcsgt\" (UniqueName: \"kubernetes.io/projected/616c3ea8-075a-475f-9896-180a02e4cc3f-kube-api-access-rcsgt\") pod \"perses-operator-5bf474d74f-w89kr\" (UID: \"616c3ea8-075a-475f-9896-180a02e4cc3f\") " pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.647983 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/616c3ea8-075a-475f-9896-180a02e4cc3f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w89kr\" (UID: \"616c3ea8-075a-475f-9896-180a02e4cc3f\") " pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.648752 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/616c3ea8-075a-475f-9896-180a02e4cc3f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-w89kr\" (UID: \"616c3ea8-075a-475f-9896-180a02e4cc3f\") " pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.688669 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcsgt\" (UniqueName: \"kubernetes.io/projected/616c3ea8-075a-475f-9896-180a02e4cc3f-kube-api-access-rcsgt\") pod \"perses-operator-5bf474d74f-w89kr\" (UID: \"616c3ea8-075a-475f-9896-180a02e4cc3f\") " pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.706538 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.811470 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.836955 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq"] Jan 30 18:41:01 crc kubenswrapper[4782]: W0130 18:41:01.878347 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e82803_8b7d_46f3_ba40_2900590261cf.slice/crio-49fcac3dac2ce912a224364d880b927e3244ad24771b6be635556316ea83deb8 WatchSource:0}: Error finding container 49fcac3dac2ce912a224364d880b927e3244ad24771b6be635556316ea83deb8: Status 404 returned error can't find the container with id 49fcac3dac2ce912a224364d880b927e3244ad24771b6be635556316ea83deb8 Jan 30 18:41:01 crc kubenswrapper[4782]: I0130 18:41:01.917049 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp"] Jan 30 18:41:02 crc kubenswrapper[4782]: I0130 18:41:02.058901 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-pcglc"] Jan 30 18:41:02 crc kubenswrapper[4782]: W0130 18:41:02.064197 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod786ed08c_6b06_4e44_aaf4_5562ef433b88.slice/crio-bf78b5131d29dd92378762cf2f243ebedad686a9316b712281bccfd76c8c042e WatchSource:0}: Error finding container bf78b5131d29dd92378762cf2f243ebedad686a9316b712281bccfd76c8c042e: Status 404 returned error can't find the container with id bf78b5131d29dd92378762cf2f243ebedad686a9316b712281bccfd76c8c042e Jan 30 18:41:02 crc kubenswrapper[4782]: I0130 18:41:02.084602 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-w89kr"] Jan 30 18:41:02 crc kubenswrapper[4782]: W0130 18:41:02.094148 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod616c3ea8_075a_475f_9896_180a02e4cc3f.slice/crio-8e8155586f5aa0ccf86228f88d992ab25211e0a326dde17b3de81a5c64277dfc WatchSource:0}: Error finding container 8e8155586f5aa0ccf86228f88d992ab25211e0a326dde17b3de81a5c64277dfc: Status 404 returned error can't find the container with id 8e8155586f5aa0ccf86228f88d992ab25211e0a326dde17b3de81a5c64277dfc Jan 30 18:41:02 crc kubenswrapper[4782]: I0130 18:41:02.134573 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-pcglc" event={"ID":"786ed08c-6b06-4e44-aaf4-5562ef433b88","Type":"ContainerStarted","Data":"bf78b5131d29dd92378762cf2f243ebedad686a9316b712281bccfd76c8c042e"} Jan 30 18:41:02 crc kubenswrapper[4782]: I0130 18:41:02.136986 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" event={"ID":"92e82803-8b7d-46f3-ba40-2900590261cf","Type":"ContainerStarted","Data":"49fcac3dac2ce912a224364d880b927e3244ad24771b6be635556316ea83deb8"} Jan 30 18:41:02 crc kubenswrapper[4782]: I0130 18:41:02.138370 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-w89kr" event={"ID":"616c3ea8-075a-475f-9896-180a02e4cc3f","Type":"ContainerStarted","Data":"8e8155586f5aa0ccf86228f88d992ab25211e0a326dde17b3de81a5c64277dfc"} Jan 30 18:41:02 crc kubenswrapper[4782]: I0130 18:41:02.139712 4782 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" event={"ID":"05098fbb-e910-4fec-8a31-fd98d476b941","Type":"ContainerStarted","Data":"e183fb9b95b9d69692d3669c3ea2440a939c094367c3e7b02f8a3ee60e53d0c5"} Jan 30 18:41:02 crc kubenswrapper[4782]: I0130 18:41:02.186376 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22"] Jan 30 18:41:02 crc kubenswrapper[4782]: W0130 18:41:02.196168 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b18d8a_aa0f_494e_9e56_55bceba788c6.slice/crio-c79ef3eb32bc21c1707fc4490cb3969b677aa55e2a9650fa848ed835dd0b1ee5 WatchSource:0}: Error finding container c79ef3eb32bc21c1707fc4490cb3969b677aa55e2a9650fa848ed835dd0b1ee5: Status 404 returned error can't find the container with id c79ef3eb32bc21c1707fc4490cb3969b677aa55e2a9650fa848ed835dd0b1ee5 Jan 30 18:41:03 crc kubenswrapper[4782]: I0130 18:41:03.153912 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" event={"ID":"19b18d8a-aa0f-494e-9e56-55bceba788c6","Type":"ContainerStarted","Data":"c79ef3eb32bc21c1707fc4490cb3969b677aa55e2a9650fa848ed835dd0b1ee5"} Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.230436 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-pcglc" event={"ID":"786ed08c-6b06-4e44-aaf4-5562ef433b88","Type":"ContainerStarted","Data":"c029a0f6073cb9dea4ed2062af6e034ee60574641e5bb11afba6155db7ec3a8c"} Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.231728 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.234335 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-pcglc" Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.235240 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" event={"ID":"19b18d8a-aa0f-494e-9e56-55bceba788c6","Type":"ContainerStarted","Data":"7722e4aa653232fa37b4f09fc5e1b2e70cf07cda092323715cc7188a913aece0"} Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.236702 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" event={"ID":"92e82803-8b7d-46f3-ba40-2900590261cf","Type":"ContainerStarted","Data":"5d0264ab6edf994e41409ea82359c44c418b86a369837538739e522b9c99ff54"} Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.245170 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" event={"ID":"05098fbb-e910-4fec-8a31-fd98d476b941","Type":"ContainerStarted","Data":"711a3b65d49104e5bd74b66c6890d8495fdec53dc375843e841c99084d772d79"} Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.258170 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-pcglc" podStartSLOduration=2.05714749 podStartE2EDuration="11.258144152s" podCreationTimestamp="2026-01-30 18:41:01 +0000 UTC" firstStartedPulling="2026-01-30 18:41:02.067980063 +0000 UTC m=+638.336358088" 
lastFinishedPulling="2026-01-30 18:41:11.268976715 +0000 UTC m=+647.537354750" observedRunningTime="2026-01-30 18:41:12.253095235 +0000 UTC m=+648.521473260" watchObservedRunningTime="2026-01-30 18:41:12.258144152 +0000 UTC m=+648.526522187" Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.272546 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22" podStartSLOduration=2.2496014 podStartE2EDuration="11.272530873s" podCreationTimestamp="2026-01-30 18:41:01 +0000 UTC" firstStartedPulling="2026-01-30 18:41:02.199062213 +0000 UTC m=+638.467440238" lastFinishedPulling="2026-01-30 18:41:11.221991676 +0000 UTC m=+647.490369711" observedRunningTime="2026-01-30 18:41:12.269877156 +0000 UTC m=+648.538255181" watchObservedRunningTime="2026-01-30 18:41:12.272530873 +0000 UTC m=+648.540908898" Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.347119 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kmgbq" podStartSLOduration=1.9578508669999999 podStartE2EDuration="11.347100865s" podCreationTimestamp="2026-01-30 18:41:01 +0000 UTC" firstStartedPulling="2026-01-30 18:41:01.900182581 +0000 UTC m=+638.168560606" lastFinishedPulling="2026-01-30 18:41:11.289432579 +0000 UTC m=+647.557810604" observedRunningTime="2026-01-30 18:41:12.33335014 +0000 UTC m=+648.601728165" watchObservedRunningTime="2026-01-30 18:41:12.347100865 +0000 UTC m=+648.615478890" Jan 30 18:41:12 crc kubenswrapper[4782]: I0130 18:41:12.371474 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp" podStartSLOduration=2.051477036 podStartE2EDuration="11.371451225s" podCreationTimestamp="2026-01-30 18:41:01 +0000 UTC" firstStartedPulling="2026-01-30 18:41:01.939468097 +0000 UTC m=+638.207846122" lastFinishedPulling="2026-01-30 18:41:11.259442286 +0000 UTC m=+647.527820311" observedRunningTime="2026-01-30 18:41:12.359005754 +0000 UTC m=+648.627383769" watchObservedRunningTime="2026-01-30 18:41:12.371451225 +0000 UTC m=+648.639829250" Jan 30 18:41:13 crc kubenswrapper[4782]: I0130 18:41:13.253056 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-w89kr" event={"ID":"616c3ea8-075a-475f-9896-180a02e4cc3f","Type":"ContainerStarted","Data":"ec5270d8125024cf17aaeb66827ce6750c30cd3f1e62b1d998b522df21ec6520"} Jan 30 18:41:13 crc kubenswrapper[4782]: I0130 18:41:13.254152 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:13 crc kubenswrapper[4782]: I0130 18:41:13.279626 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-w89kr" podStartSLOduration=1.5227892280000002 podStartE2EDuration="12.279604468s" podCreationTimestamp="2026-01-30 18:41:01 +0000 UTC" firstStartedPulling="2026-01-30 18:41:02.096746565 +0000 UTC m=+638.365124580" lastFinishedPulling="2026-01-30 18:41:12.853561795 +0000 UTC m=+649.121939820" observedRunningTime="2026-01-30 18:41:13.27809625 +0000 UTC m=+649.546474275" watchObservedRunningTime="2026-01-30 18:41:13.279604468 +0000 UTC m=+649.547982493" Jan 30 18:41:17 crc kubenswrapper[4782]: I0130 18:41:17.511949 4782 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 18:41:21 crc kubenswrapper[4782]: I0130 18:41:21.814392 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-w89kr" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.247930 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn"] Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.250476 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.252887 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.266716 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn"] Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.314884 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.314938 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb6c2\" (UniqueName: \"kubernetes.io/projected/104efc3a-1dff-4e45-8448-ea03ec78e23f-kube-api-access-cb6c2\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.314970 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.415736 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.415789 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb6c2\" (UniqueName: \"kubernetes.io/projected/104efc3a-1dff-4e45-8448-ea03ec78e23f-kube-api-access-cb6c2\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.415825 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.416398 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.416432 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.445225 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb6c2\" (UniqueName: \"kubernetes.io/projected/104efc3a-1dff-4e45-8448-ea03ec78e23f-kube-api-access-cb6c2\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:39 crc kubenswrapper[4782]: I0130 18:41:39.575971 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:40 crc kubenswrapper[4782]: I0130 18:41:40.064830 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn"] Jan 30 18:41:40 crc kubenswrapper[4782]: W0130 18:41:40.070621 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod104efc3a_1dff_4e45_8448_ea03ec78e23f.slice/crio-fec77fe188958a69b7f3e342ca6aeb90f31405ce65d37204664689e7475390af WatchSource:0}: Error finding container fec77fe188958a69b7f3e342ca6aeb90f31405ce65d37204664689e7475390af: Status 404 returned error can't find the container with id fec77fe188958a69b7f3e342ca6aeb90f31405ce65d37204664689e7475390af Jan 30 18:41:40 crc kubenswrapper[4782]: I0130 18:41:40.422500 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" event={"ID":"104efc3a-1dff-4e45-8448-ea03ec78e23f","Type":"ContainerStarted","Data":"fec77fe188958a69b7f3e342ca6aeb90f31405ce65d37204664689e7475390af"} Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.187088 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-znwgp"] Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.189222 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.207655 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-znwgp"] Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.251477 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-catalog-content\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.251600 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72qk\" (UniqueName: \"kubernetes.io/projected/c96ab678-9851-4682-a48b-6e977e283327-kube-api-access-h72qk\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.251637 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-utilities\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.352816 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-catalog-content\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.352897 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h72qk\" (UniqueName: \"kubernetes.io/projected/c96ab678-9851-4682-a48b-6e977e283327-kube-api-access-h72qk\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.352925 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-utilities\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.353528 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-catalog-content\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.353548 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-utilities\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.375701 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h72qk\" (UniqueName: \"kubernetes.io/projected/c96ab678-9851-4682-a48b-6e977e283327-kube-api-access-h72qk\") pod \"redhat-operators-znwgp\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.445244 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" event={"ID":"104efc3a-1dff-4e45-8448-ea03ec78e23f","Type":"ContainerStarted","Data":"ea2055d73e6cb0405421f263de6f39144892184460351014f49e3e7f0e122552"} Jan 30 18:41:42 crc kubenswrapper[4782]: I0130 18:41:42.522322 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.041826 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-znwgp"] Jan 30 18:41:43 crc kubenswrapper[4782]: W0130 18:41:43.052916 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc96ab678_9851_4682_a48b_6e977e283327.slice/crio-4d893b39f039c8a3383951950a90bf053f517059788958da18258d273b319416 WatchSource:0}: Error finding container 4d893b39f039c8a3383951950a90bf053f517059788958da18258d273b319416: Status 404 returned error can't find the container with id 4d893b39f039c8a3383951950a90bf053f517059788958da18258d273b319416 Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.451192 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znwgp" event={"ID":"c96ab678-9851-4682-a48b-6e977e283327","Type":"ContainerStarted","Data":"4d893b39f039c8a3383951950a90bf053f517059788958da18258d273b319416"} Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.574299 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dm825"] Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.575271 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.593799 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm825"] Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.676050 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-catalog-content\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.676092 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-utilities\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.676121 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjzbv\" (UniqueName: \"kubernetes.io/projected/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-kube-api-access-wjzbv\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.780136 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-catalog-content\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.780207 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-utilities\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.780270 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjzbv\" (UniqueName: \"kubernetes.io/projected/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-kube-api-access-wjzbv\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.780840 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-catalog-content\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.780866 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-utilities\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.808830 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wjzbv\" (UniqueName: \"kubernetes.io/projected/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-kube-api-access-wjzbv\") pod \"redhat-marketplace-dm825\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:43 crc kubenswrapper[4782]: I0130 18:41:43.889041 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:44 crc kubenswrapper[4782]: I0130 18:41:44.303899 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm825"] Jan 30 18:41:44 crc kubenswrapper[4782]: W0130 18:41:44.316090 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7714feb1_9ee4_4b66_a87c_eb7aaa528a33.slice/crio-915794d51ed21ec1a684b4313047b68c19f0600c0961dbdc76c8510173af1014 WatchSource:0}: Error finding container 915794d51ed21ec1a684b4313047b68c19f0600c0961dbdc76c8510173af1014: Status 404 returned error can't find the container with id 915794d51ed21ec1a684b4313047b68c19f0600c0961dbdc76c8510173af1014 Jan 30 18:41:44 crc kubenswrapper[4782]: I0130 18:41:44.461366 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm825" event={"ID":"7714feb1-9ee4-4b66-a87c-eb7aaa528a33","Type":"ContainerStarted","Data":"915794d51ed21ec1a684b4313047b68c19f0600c0961dbdc76c8510173af1014"} Jan 30 18:41:44 crc kubenswrapper[4782]: I0130 18:41:44.463681 4782 generic.go:334] "Generic (PLEG): container finished" podID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerID="ea2055d73e6cb0405421f263de6f39144892184460351014f49e3e7f0e122552" exitCode=0 Jan 30 18:41:44 crc kubenswrapper[4782]: I0130 18:41:44.463713 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" event={"ID":"104efc3a-1dff-4e45-8448-ea03ec78e23f","Type":"ContainerDied","Data":"ea2055d73e6cb0405421f263de6f39144892184460351014f49e3e7f0e122552"} Jan 30 18:41:46 crc kubenswrapper[4782]: I0130 18:41:46.478708 4782 generic.go:334] "Generic (PLEG): container finished" podID="c96ab678-9851-4682-a48b-6e977e283327" containerID="8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c" exitCode=0 Jan 30 18:41:46 crc kubenswrapper[4782]: I0130 18:41:46.478778 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znwgp" event={"ID":"c96ab678-9851-4682-a48b-6e977e283327","Type":"ContainerDied","Data":"8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c"} Jan 30 18:41:46 crc kubenswrapper[4782]: I0130 18:41:46.483864 4782 generic.go:334] "Generic (PLEG): container finished" podID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerID="3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7" exitCode=0 Jan 30 18:41:46 crc kubenswrapper[4782]: I0130 18:41:46.483928 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm825" event={"ID":"7714feb1-9ee4-4b66-a87c-eb7aaa528a33","Type":"ContainerDied","Data":"3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7"} Jan 30 18:41:47 crc kubenswrapper[4782]: I0130 18:41:47.495434 4782 generic.go:334] "Generic (PLEG): container finished" podID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerID="1ca5dcd3ac32700efa6dcdbd8aec455bef12911fd6bb722e4ce5e81d4ba5f1db" exitCode=0 Jan 30 18:41:47 crc 
kubenswrapper[4782]: I0130 18:41:47.495482 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" event={"ID":"104efc3a-1dff-4e45-8448-ea03ec78e23f","Type":"ContainerDied","Data":"1ca5dcd3ac32700efa6dcdbd8aec455bef12911fd6bb722e4ce5e81d4ba5f1db"} Jan 30 18:41:48 crc kubenswrapper[4782]: I0130 18:41:48.504152 4782 generic.go:334] "Generic (PLEG): container finished" podID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerID="70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d" exitCode=0 Jan 30 18:41:48 crc kubenswrapper[4782]: I0130 18:41:48.504225 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm825" event={"ID":"7714feb1-9ee4-4b66-a87c-eb7aaa528a33","Type":"ContainerDied","Data":"70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d"} Jan 30 18:41:48 crc kubenswrapper[4782]: I0130 18:41:48.508692 4782 generic.go:334] "Generic (PLEG): container finished" podID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerID="9a4a339c1c5259e25656b5d57198d7a7e1dd9592279dec3ac381ee3f5ff281f6" exitCode=0 Jan 30 18:41:48 crc kubenswrapper[4782]: I0130 18:41:48.508784 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" event={"ID":"104efc3a-1dff-4e45-8448-ea03ec78e23f","Type":"ContainerDied","Data":"9a4a339c1c5259e25656b5d57198d7a7e1dd9592279dec3ac381ee3f5ff281f6"} Jan 30 18:41:48 crc kubenswrapper[4782]: I0130 18:41:48.513134 4782 generic.go:334] "Generic (PLEG): container finished" podID="c96ab678-9851-4682-a48b-6e977e283327" containerID="5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159" exitCode=0 Jan 30 18:41:48 crc kubenswrapper[4782]: I0130 18:41:48.513180 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znwgp" event={"ID":"c96ab678-9851-4682-a48b-6e977e283327","Type":"ContainerDied","Data":"5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159"} Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.521241 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znwgp" event={"ID":"c96ab678-9851-4682-a48b-6e977e283327","Type":"ContainerStarted","Data":"5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6"} Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.523888 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm825" event={"ID":"7714feb1-9ee4-4b66-a87c-eb7aaa528a33","Type":"ContainerStarted","Data":"bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f"} Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.544647 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-znwgp" podStartSLOduration=5.053974473 podStartE2EDuration="7.544620934s" podCreationTimestamp="2026-01-30 18:41:42 +0000 UTC" firstStartedPulling="2026-01-30 18:41:46.499387552 +0000 UTC m=+682.767765577" lastFinishedPulling="2026-01-30 18:41:48.990034003 +0000 UTC m=+685.258412038" observedRunningTime="2026-01-30 18:41:49.541443505 +0000 UTC m=+685.809821590" watchObservedRunningTime="2026-01-30 18:41:49.544620934 +0000 UTC m=+685.812998959" Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.570194 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-dm825" podStartSLOduration=4.17187117 podStartE2EDuration="6.570168656s" podCreationTimestamp="2026-01-30 18:41:43 +0000 UTC" firstStartedPulling="2026-01-30 18:41:46.500718955 +0000 UTC m=+682.769096980" lastFinishedPulling="2026-01-30 18:41:48.899016441 +0000 UTC m=+685.167394466" observedRunningTime="2026-01-30 18:41:49.565122501 +0000 UTC m=+685.833500556" watchObservedRunningTime="2026-01-30 18:41:49.570168656 +0000 UTC m=+685.838546701" Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.852915 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.962331 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-util\") pod \"104efc3a-1dff-4e45-8448-ea03ec78e23f\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.962526 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-bundle\") pod \"104efc3a-1dff-4e45-8448-ea03ec78e23f\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.962570 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb6c2\" (UniqueName: \"kubernetes.io/projected/104efc3a-1dff-4e45-8448-ea03ec78e23f-kube-api-access-cb6c2\") pod \"104efc3a-1dff-4e45-8448-ea03ec78e23f\" (UID: \"104efc3a-1dff-4e45-8448-ea03ec78e23f\") " Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.963178 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-bundle" (OuterVolumeSpecName: "bundle") pod "104efc3a-1dff-4e45-8448-ea03ec78e23f" (UID: "104efc3a-1dff-4e45-8448-ea03ec78e23f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.969398 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/104efc3a-1dff-4e45-8448-ea03ec78e23f-kube-api-access-cb6c2" (OuterVolumeSpecName: "kube-api-access-cb6c2") pod "104efc3a-1dff-4e45-8448-ea03ec78e23f" (UID: "104efc3a-1dff-4e45-8448-ea03ec78e23f"). InnerVolumeSpecName "kube-api-access-cb6c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:41:49 crc kubenswrapper[4782]: I0130 18:41:49.983398 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-util" (OuterVolumeSpecName: "util") pod "104efc3a-1dff-4e45-8448-ea03ec78e23f" (UID: "104efc3a-1dff-4e45-8448-ea03ec78e23f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:41:50 crc kubenswrapper[4782]: I0130 18:41:50.064473 4782 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:41:50 crc kubenswrapper[4782]: I0130 18:41:50.064523 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb6c2\" (UniqueName: \"kubernetes.io/projected/104efc3a-1dff-4e45-8448-ea03ec78e23f-kube-api-access-cb6c2\") on node \"crc\" DevicePath \"\"" Jan 30 18:41:50 crc kubenswrapper[4782]: I0130 18:41:50.064538 4782 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/104efc3a-1dff-4e45-8448-ea03ec78e23f-util\") on node \"crc\" DevicePath \"\"" Jan 30 18:41:50 crc kubenswrapper[4782]: I0130 18:41:50.533498 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" event={"ID":"104efc3a-1dff-4e45-8448-ea03ec78e23f","Type":"ContainerDied","Data":"fec77fe188958a69b7f3e342ca6aeb90f31405ce65d37204664689e7475390af"} Jan 30 18:41:50 crc kubenswrapper[4782]: I0130 18:41:50.533578 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fec77fe188958a69b7f3e342ca6aeb90f31405ce65d37204664689e7475390af" Jan 30 18:41:50 crc kubenswrapper[4782]: I0130 18:41:50.533677 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.937303 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-prp59"] Jan 30 18:41:51 crc kubenswrapper[4782]: E0130 18:41:51.937525 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerName="util" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.937536 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerName="util" Jan 30 18:41:51 crc kubenswrapper[4782]: E0130 18:41:51.937544 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerName="extract" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.937550 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerName="extract" Jan 30 18:41:51 crc kubenswrapper[4782]: E0130 18:41:51.937560 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerName="pull" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.937568 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerName="pull" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.937664 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="104efc3a-1dff-4e45-8448-ea03ec78e23f" containerName="extract" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.938093 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-prp59" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.940867 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.940898 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mbvqv" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.941024 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 30 18:41:51 crc kubenswrapper[4782]: I0130 18:41:51.953446 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-prp59"] Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.089413 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72zfj\" (UniqueName: \"kubernetes.io/projected/37a25f92-459c-447c-846b-bfd73a950907-kube-api-access-72zfj\") pod \"nmstate-operator-646758c888-prp59\" (UID: \"37a25f92-459c-447c-846b-bfd73a950907\") " pod="openshift-nmstate/nmstate-operator-646758c888-prp59" Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.190291 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72zfj\" (UniqueName: \"kubernetes.io/projected/37a25f92-459c-447c-846b-bfd73a950907-kube-api-access-72zfj\") pod \"nmstate-operator-646758c888-prp59\" (UID: \"37a25f92-459c-447c-846b-bfd73a950907\") " pod="openshift-nmstate/nmstate-operator-646758c888-prp59" Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.213067 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72zfj\" (UniqueName: \"kubernetes.io/projected/37a25f92-459c-447c-846b-bfd73a950907-kube-api-access-72zfj\") pod \"nmstate-operator-646758c888-prp59\" (UID: \"37a25f92-459c-447c-846b-bfd73a950907\") " pod="openshift-nmstate/nmstate-operator-646758c888-prp59" Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.252696 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-prp59" Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.505017 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-prp59"] Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.523343 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.523501 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:41:52 crc kubenswrapper[4782]: I0130 18:41:52.550357 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-prp59" event={"ID":"37a25f92-459c-447c-846b-bfd73a950907","Type":"ContainerStarted","Data":"bb78e6fad47cbb6d0a55ad268f4f7d1c1afa035fb76319798fac3b2d53b279bd"} Jan 30 18:41:53 crc kubenswrapper[4782]: I0130 18:41:53.559458 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-znwgp" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="registry-server" probeResult="failure" output=< Jan 30 18:41:53 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 18:41:53 crc kubenswrapper[4782]: > Jan 30 18:41:53 crc kubenswrapper[4782]: I0130 18:41:53.889275 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:53 crc kubenswrapper[4782]: I0130 18:41:53.889894 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:53 crc kubenswrapper[4782]: I0130 18:41:53.947477 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:54 crc kubenswrapper[4782]: I0130 18:41:54.607121 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:55 crc kubenswrapper[4782]: I0130 18:41:55.765999 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm825"] Jan 30 18:41:56 crc kubenswrapper[4782]: I0130 18:41:56.578512 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-prp59" event={"ID":"37a25f92-459c-447c-846b-bfd73a950907","Type":"ContainerStarted","Data":"4028742e30019ecb9868b5750d220de29a59c9a40684d2d8161eb405d0fa4a6c"} Jan 30 18:41:56 crc kubenswrapper[4782]: I0130 18:41:56.605572 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-prp59" podStartSLOduration=2.152502036 podStartE2EDuration="5.605548258s" podCreationTimestamp="2026-01-30 18:41:51 +0000 UTC" firstStartedPulling="2026-01-30 18:41:52.519239428 +0000 UTC m=+688.787617453" lastFinishedPulling="2026-01-30 18:41:55.97228565 +0000 UTC m=+692.240663675" observedRunningTime="2026-01-30 18:41:56.604090082 +0000 UTC m=+692.872468147" watchObservedRunningTime="2026-01-30 18:41:56.605548258 +0000 UTC m=+692.873926293" Jan 30 18:41:57 crc kubenswrapper[4782]: I0130 18:41:57.586460 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dm825" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="registry-server" 
containerID="cri-o://bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f" gracePeriod=2 Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.032178 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.165261 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-utilities\") pod \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.165368 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjzbv\" (UniqueName: \"kubernetes.io/projected/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-kube-api-access-wjzbv\") pod \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.165504 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-catalog-content\") pod \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\" (UID: \"7714feb1-9ee4-4b66-a87c-eb7aaa528a33\") " Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.166698 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-utilities" (OuterVolumeSpecName: "utilities") pod "7714feb1-9ee4-4b66-a87c-eb7aaa528a33" (UID: "7714feb1-9ee4-4b66-a87c-eb7aaa528a33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.172982 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-kube-api-access-wjzbv" (OuterVolumeSpecName: "kube-api-access-wjzbv") pod "7714feb1-9ee4-4b66-a87c-eb7aaa528a33" (UID: "7714feb1-9ee4-4b66-a87c-eb7aaa528a33"). InnerVolumeSpecName "kube-api-access-wjzbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.267668 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.267746 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjzbv\" (UniqueName: \"kubernetes.io/projected/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-kube-api-access-wjzbv\") on node \"crc\" DevicePath \"\"" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.597910 4782 generic.go:334] "Generic (PLEG): container finished" podID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerID="bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f" exitCode=0 Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.597992 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm825" event={"ID":"7714feb1-9ee4-4b66-a87c-eb7aaa528a33","Type":"ContainerDied","Data":"bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f"} Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.598038 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dm825" event={"ID":"7714feb1-9ee4-4b66-a87c-eb7aaa528a33","Type":"ContainerDied","Data":"915794d51ed21ec1a684b4313047b68c19f0600c0961dbdc76c8510173af1014"} Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.598069 4782 scope.go:117] "RemoveContainer" containerID="bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.599668 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dm825" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.628536 4782 scope.go:117] "RemoveContainer" containerID="70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.644725 4782 scope.go:117] "RemoveContainer" containerID="3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.675789 4782 scope.go:117] "RemoveContainer" containerID="bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f" Jan 30 18:41:58 crc kubenswrapper[4782]: E0130 18:41:58.676499 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f\": container with ID starting with bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f not found: ID does not exist" containerID="bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.676547 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f"} err="failed to get container status \"bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f\": rpc error: code = NotFound desc = could not find container \"bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f\": container with ID starting with bddcd314392b1e5689c0216c39713089ea14797dc16849ff39f5b8b17cf4fa4f not found: ID does not exist" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.676579 4782 scope.go:117] "RemoveContainer" containerID="70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d" Jan 30 18:41:58 crc kubenswrapper[4782]: E0130 18:41:58.677076 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d\": container with ID starting with 70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d not found: ID does not exist" containerID="70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.677121 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d"} err="failed to get container status \"70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d\": rpc error: code = NotFound desc = could not find container \"70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d\": container with ID starting with 70e2bf913c3f799b5d334b2e029610b88dcb46e39c24eedc2cd70d360fccb28d not found: ID does not exist" Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.677138 4782 scope.go:117] "RemoveContainer" containerID="3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7" Jan 30 18:41:58 crc kubenswrapper[4782]: E0130 18:41:58.677533 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7\": container with ID starting with 3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7 not found: ID does not exist" containerID="3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7" 
Jan 30 18:41:58 crc kubenswrapper[4782]: I0130 18:41:58.677574 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7"} err="failed to get container status \"3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7\": rpc error: code = NotFound desc = could not find container \"3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7\": container with ID starting with 3c1ebe0f7c496bae04e8eac003f14cfee063718d87b00483dc231a5294e94af7 not found: ID does not exist" Jan 30 18:41:59 crc kubenswrapper[4782]: I0130 18:41:59.218319 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7714feb1-9ee4-4b66-a87c-eb7aaa528a33" (UID: "7714feb1-9ee4-4b66-a87c-eb7aaa528a33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:41:59 crc kubenswrapper[4782]: I0130 18:41:59.284515 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7714feb1-9ee4-4b66-a87c-eb7aaa528a33-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:41:59 crc kubenswrapper[4782]: I0130 18:41:59.544040 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm825"] Jan 30 18:41:59 crc kubenswrapper[4782]: I0130 18:41:59.552267 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dm825"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.417083 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" path="/var/lib/kubelet/pods/7714feb1-9ee4-4b66-a87c-eb7aaa528a33/volumes" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.525742 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-5bs9t"] Jan 30 18:42:00 crc kubenswrapper[4782]: E0130 18:42:00.525959 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="extract-content" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.525970 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="extract-content" Jan 30 18:42:00 crc kubenswrapper[4782]: E0130 18:42:00.525979 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="extract-utilities" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.525985 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="extract-utilities" Jan 30 18:42:00 crc kubenswrapper[4782]: E0130 18:42:00.525993 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="registry-server" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.525999 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="registry-server" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.526099 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7714feb1-9ee4-4b66-a87c-eb7aaa528a33" containerName="registry-server" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.526656 4782 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.530685 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vnzn7" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.534493 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.535130 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.537432 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.540216 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-5bs9t"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.562351 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-2blvc"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.563619 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.566860 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.601557 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkk9m\" (UniqueName: \"kubernetes.io/projected/8bab1b5d-f025-4df0-ba3c-d406621dd5ac-kube-api-access-nkk9m\") pod \"nmstate-metrics-54757c584b-5bs9t\" (UID: \"8bab1b5d-f025-4df0-ba3c-d406621dd5ac\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.601619 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5wp\" (UniqueName: \"kubernetes.io/projected/7022f3b6-d4c1-4b83-b541-2125a53e701c-kube-api-access-xb5wp\") pod \"nmstate-webhook-8474b5b9d8-hsdpj\" (UID: \"7022f3b6-d4c1-4b83-b541-2125a53e701c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.601642 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7022f3b6-d4c1-4b83-b541-2125a53e701c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hsdpj\" (UID: \"7022f3b6-d4c1-4b83-b541-2125a53e701c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.676689 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.677521 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.680429 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.681270 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gbcnv" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.681586 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.684637 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.705078 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8wds\" (UniqueName: \"kubernetes.io/projected/61543235-f4f6-4320-b2ef-11521d91d360-kube-api-access-j8wds\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.705312 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7022f3b6-d4c1-4b83-b541-2125a53e701c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hsdpj\" (UID: \"7022f3b6-d4c1-4b83-b541-2125a53e701c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.705387 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5wp\" (UniqueName: \"kubernetes.io/projected/7022f3b6-d4c1-4b83-b541-2125a53e701c-kube-api-access-xb5wp\") pod \"nmstate-webhook-8474b5b9d8-hsdpj\" (UID: \"7022f3b6-d4c1-4b83-b541-2125a53e701c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.705516 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-dbus-socket\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.705594 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-nmstate-lock\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.705672 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkk9m\" (UniqueName: \"kubernetes.io/projected/8bab1b5d-f025-4df0-ba3c-d406621dd5ac-kube-api-access-nkk9m\") pod \"nmstate-metrics-54757c584b-5bs9t\" (UID: \"8bab1b5d-f025-4df0-ba3c-d406621dd5ac\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.705750 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-ovs-socket\") pod \"nmstate-handler-2blvc\" (UID: 
\"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: E0130 18:42:00.705945 4782 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 30 18:42:00 crc kubenswrapper[4782]: E0130 18:42:00.706053 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7022f3b6-d4c1-4b83-b541-2125a53e701c-tls-key-pair podName:7022f3b6-d4c1-4b83-b541-2125a53e701c nodeName:}" failed. No retries permitted until 2026-01-30 18:42:01.206036107 +0000 UTC m=+697.474414132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/7022f3b6-d4c1-4b83-b541-2125a53e701c-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-hsdpj" (UID: "7022f3b6-d4c1-4b83-b541-2125a53e701c") : secret "openshift-nmstate-webhook" not found Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.728671 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkk9m\" (UniqueName: \"kubernetes.io/projected/8bab1b5d-f025-4df0-ba3c-d406621dd5ac-kube-api-access-nkk9m\") pod \"nmstate-metrics-54757c584b-5bs9t\" (UID: \"8bab1b5d-f025-4df0-ba3c-d406621dd5ac\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.731633 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5wp\" (UniqueName: \"kubernetes.io/projected/7022f3b6-d4c1-4b83-b541-2125a53e701c-kube-api-access-xb5wp\") pod \"nmstate-webhook-8474b5b9d8-hsdpj\" (UID: \"7022f3b6-d4c1-4b83-b541-2125a53e701c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807416 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wds\" (UniqueName: \"kubernetes.io/projected/61543235-f4f6-4320-b2ef-11521d91d360-kube-api-access-j8wds\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807484 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42d8b05a-8142-462f-b3ad-e496c30e8eea-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807517 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-dbus-socket\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807535 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d8b05a-8142-462f-b3ad-e496c30e8eea-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807550 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-t87zc\" (UniqueName: \"kubernetes.io/projected/42d8b05a-8142-462f-b3ad-e496c30e8eea-kube-api-access-t87zc\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807567 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-nmstate-lock\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807597 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-ovs-socket\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.807653 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-ovs-socket\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.808113 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-nmstate-lock\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.808119 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/61543235-f4f6-4320-b2ef-11521d91d360-dbus-socket\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.829000 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8wds\" (UniqueName: \"kubernetes.io/projected/61543235-f4f6-4320-b2ef-11521d91d360-kube-api-access-j8wds\") pod \"nmstate-handler-2blvc\" (UID: \"61543235-f4f6-4320-b2ef-11521d91d360\") " pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.829848 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-664cc6565-r2qz5"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.830717 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.841777 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664cc6565-r2qz5"] Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.845456 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.882431 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908296 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmxgx\" (UniqueName: \"kubernetes.io/projected/987697e6-a7fc-4f7d-b232-820da4f3176d-kube-api-access-gmxgx\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908635 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/987697e6-a7fc-4f7d-b232-820da4f3176d-console-oauth-config\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908669 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-oauth-serving-cert\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908692 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-console-config\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908718 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42d8b05a-8142-462f-b3ad-e496c30e8eea-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908744 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-trusted-ca-bundle\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908759 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/987697e6-a7fc-4f7d-b232-820da4f3176d-console-serving-cert\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908781 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-service-ca\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908798 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/42d8b05a-8142-462f-b3ad-e496c30e8eea-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.908814 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87zc\" (UniqueName: \"kubernetes.io/projected/42d8b05a-8142-462f-b3ad-e496c30e8eea-kube-api-access-t87zc\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.909912 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/42d8b05a-8142-462f-b3ad-e496c30e8eea-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.912609 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/42d8b05a-8142-462f-b3ad-e496c30e8eea-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.925791 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87zc\" (UniqueName: \"kubernetes.io/projected/42d8b05a-8142-462f-b3ad-e496c30e8eea-kube-api-access-t87zc\") pod \"nmstate-console-plugin-7754f76f8b-z6fgp\" (UID: \"42d8b05a-8142-462f-b3ad-e496c30e8eea\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:00 crc kubenswrapper[4782]: I0130 18:42:00.996031 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.010401 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-oauth-serving-cert\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.010440 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-console-config\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.010475 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-trusted-ca-bundle\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.010492 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/987697e6-a7fc-4f7d-b232-820da4f3176d-console-serving-cert\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.010513 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-service-ca\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.010550 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmxgx\" (UniqueName: \"kubernetes.io/projected/987697e6-a7fc-4f7d-b232-820da4f3176d-kube-api-access-gmxgx\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.010581 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/987697e6-a7fc-4f7d-b232-820da4f3176d-console-oauth-config\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.012510 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-service-ca\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.013048 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-oauth-serving-cert\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " 
pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.013385 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-trusted-ca-bundle\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.013484 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/987697e6-a7fc-4f7d-b232-820da4f3176d-console-config\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.014971 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/987697e6-a7fc-4f7d-b232-820da4f3176d-console-oauth-config\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.016106 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/987697e6-a7fc-4f7d-b232-820da4f3176d-console-serving-cert\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.032214 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmxgx\" (UniqueName: \"kubernetes.io/projected/987697e6-a7fc-4f7d-b232-820da4f3176d-kube-api-access-gmxgx\") pod \"console-664cc6565-r2qz5\" (UID: \"987697e6-a7fc-4f7d-b232-820da4f3176d\") " pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.171824 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp"] Jan 30 18:42:01 crc kubenswrapper[4782]: W0130 18:42:01.175670 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d8b05a_8142_462f_b3ad_e496c30e8eea.slice/crio-592315ab8bb432c1ff9a1bf190dea9054651eae38b0461ef9cb64b30cfa63a59 WatchSource:0}: Error finding container 592315ab8bb432c1ff9a1bf190dea9054651eae38b0461ef9cb64b30cfa63a59: Status 404 returned error can't find the container with id 592315ab8bb432c1ff9a1bf190dea9054651eae38b0461ef9cb64b30cfa63a59 Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.213674 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7022f3b6-d4c1-4b83-b541-2125a53e701c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hsdpj\" (UID: \"7022f3b6-d4c1-4b83-b541-2125a53e701c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.217831 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7022f3b6-d4c1-4b83-b541-2125a53e701c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hsdpj\" (UID: \"7022f3b6-d4c1-4b83-b541-2125a53e701c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.232601 
4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.295899 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-5bs9t"] Jan 30 18:42:01 crc kubenswrapper[4782]: W0130 18:42:01.305432 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bab1b5d_f025_4df0_ba3c_d406621dd5ac.slice/crio-e7a7bfb328db96dbc3b49d81ad2abfb1d679caa8074b0f9a2712e6ca08a84b21 WatchSource:0}: Error finding container e7a7bfb328db96dbc3b49d81ad2abfb1d679caa8074b0f9a2712e6ca08a84b21: Status 404 returned error can't find the container with id e7a7bfb328db96dbc3b49d81ad2abfb1d679caa8074b0f9a2712e6ca08a84b21 Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.451400 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.628517 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2blvc" event={"ID":"61543235-f4f6-4320-b2ef-11521d91d360","Type":"ContainerStarted","Data":"5b564799c2b2a828bd1cfc68762c3d9b415facc79f272aa3ab01dac62eea13ca"} Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.630063 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" event={"ID":"42d8b05a-8142-462f-b3ad-e496c30e8eea","Type":"ContainerStarted","Data":"592315ab8bb432c1ff9a1bf190dea9054651eae38b0461ef9cb64b30cfa63a59"} Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.631305 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" event={"ID":"8bab1b5d-f025-4df0-ba3c-d406621dd5ac","Type":"ContainerStarted","Data":"e7a7bfb328db96dbc3b49d81ad2abfb1d679caa8074b0f9a2712e6ca08a84b21"} Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.632595 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-664cc6565-r2qz5"] Jan 30 18:42:01 crc kubenswrapper[4782]: W0130 18:42:01.633270 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7022f3b6_d4c1_4b83_b541_2125a53e701c.slice/crio-900d44ab098335780868bec9c1d8acfe55b41ab1f4bd7cd7774acbe1b845f8d1 WatchSource:0}: Error finding container 900d44ab098335780868bec9c1d8acfe55b41ab1f4bd7cd7774acbe1b845f8d1: Status 404 returned error can't find the container with id 900d44ab098335780868bec9c1d8acfe55b41ab1f4bd7cd7774acbe1b845f8d1 Jan 30 18:42:01 crc kubenswrapper[4782]: W0130 18:42:01.638181 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987697e6_a7fc_4f7d_b232_820da4f3176d.slice/crio-0f59d536a0ba0387bc5e6e46e78f3fa481f7efa5ea485a115ca945807b7a2edf WatchSource:0}: Error finding container 0f59d536a0ba0387bc5e6e46e78f3fa481f7efa5ea485a115ca945807b7a2edf: Status 404 returned error can't find the container with id 0f59d536a0ba0387bc5e6e46e78f3fa481f7efa5ea485a115ca945807b7a2edf Jan 30 18:42:01 crc kubenswrapper[4782]: I0130 18:42:01.638786 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj"] Jan 30 18:42:02 crc kubenswrapper[4782]: I0130 18:42:02.579040 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:42:02 crc kubenswrapper[4782]: I0130 18:42:02.638935 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664cc6565-r2qz5" event={"ID":"987697e6-a7fc-4f7d-b232-820da4f3176d","Type":"ContainerStarted","Data":"308faa754d79527a8cd5f041b78a548d695e8be053e73ef6006a7c13cf7f2beb"} Jan 30 18:42:02 crc kubenswrapper[4782]: I0130 18:42:02.639245 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-664cc6565-r2qz5" event={"ID":"987697e6-a7fc-4f7d-b232-820da4f3176d","Type":"ContainerStarted","Data":"0f59d536a0ba0387bc5e6e46e78f3fa481f7efa5ea485a115ca945807b7a2edf"} Jan 30 18:42:02 crc kubenswrapper[4782]: I0130 18:42:02.640204 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" event={"ID":"7022f3b6-d4c1-4b83-b541-2125a53e701c","Type":"ContainerStarted","Data":"900d44ab098335780868bec9c1d8acfe55b41ab1f4bd7cd7774acbe1b845f8d1"} Jan 30 18:42:02 crc kubenswrapper[4782]: I0130 18:42:02.641963 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:42:02 crc kubenswrapper[4782]: I0130 18:42:02.657340 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-664cc6565-r2qz5" podStartSLOduration=2.6573250440000002 podStartE2EDuration="2.657325044s" podCreationTimestamp="2026-01-30 18:42:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:42:02.656452032 +0000 UTC m=+698.924830077" watchObservedRunningTime="2026-01-30 18:42:02.657325044 +0000 UTC m=+698.925703069" Jan 30 18:42:02 crc kubenswrapper[4782]: I0130 18:42:02.818841 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-znwgp"] Jan 30 18:42:03 crc kubenswrapper[4782]: I0130 18:42:03.647634 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-znwgp" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="registry-server" containerID="cri-o://5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6" gracePeriod=2 Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.350518 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.466667 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-catalog-content\") pod \"c96ab678-9851-4682-a48b-6e977e283327\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.467095 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h72qk\" (UniqueName: \"kubernetes.io/projected/c96ab678-9851-4682-a48b-6e977e283327-kube-api-access-h72qk\") pod \"c96ab678-9851-4682-a48b-6e977e283327\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.467125 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-utilities\") pod \"c96ab678-9851-4682-a48b-6e977e283327\" (UID: \"c96ab678-9851-4682-a48b-6e977e283327\") " Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.468030 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-utilities" (OuterVolumeSpecName: "utilities") pod "c96ab678-9851-4682-a48b-6e977e283327" (UID: "c96ab678-9851-4682-a48b-6e977e283327"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.477054 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96ab678-9851-4682-a48b-6e977e283327-kube-api-access-h72qk" (OuterVolumeSpecName: "kube-api-access-h72qk") pod "c96ab678-9851-4682-a48b-6e977e283327" (UID: "c96ab678-9851-4682-a48b-6e977e283327"). InnerVolumeSpecName "kube-api-access-h72qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.568317 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h72qk\" (UniqueName: \"kubernetes.io/projected/c96ab678-9851-4682-a48b-6e977e283327-kube-api-access-h72qk\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.568356 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.577732 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c96ab678-9851-4682-a48b-6e977e283327" (UID: "c96ab678-9851-4682-a48b-6e977e283327"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.656392 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" event={"ID":"8bab1b5d-f025-4df0-ba3c-d406621dd5ac","Type":"ContainerStarted","Data":"6080a0b0824be87c91c1182cba2de4f38f563b1a573fc7f750e09a1dd96ece8e"} Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.658713 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2blvc" event={"ID":"61543235-f4f6-4320-b2ef-11521d91d360","Type":"ContainerStarted","Data":"93155945acd0dce059b66813e05ffdac0eadfe088dc905850273ec0d7bb85c24"} Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.658888 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.660046 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" event={"ID":"42d8b05a-8142-462f-b3ad-e496c30e8eea","Type":"ContainerStarted","Data":"373c85910707879e7b8a0dd38288362daed2b1d3e1029be75e3125507bee16f8"} Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.661876 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" event={"ID":"7022f3b6-d4c1-4b83-b541-2125a53e701c","Type":"ContainerStarted","Data":"cfe2dc4e9854cc6ada971ac9f2fd7046b692ecb2500f52561ed8ad55abd99680"} Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.662007 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.665547 4782 generic.go:334] "Generic (PLEG): container finished" podID="c96ab678-9851-4682-a48b-6e977e283327" containerID="5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6" exitCode=0 Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.665606 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znwgp" event={"ID":"c96ab678-9851-4682-a48b-6e977e283327","Type":"ContainerDied","Data":"5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6"} Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.665632 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znwgp" event={"ID":"c96ab678-9851-4682-a48b-6e977e283327","Type":"ContainerDied","Data":"4d893b39f039c8a3383951950a90bf053f517059788958da18258d273b319416"} Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.665649 4782 scope.go:117] "RemoveContainer" containerID="5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.665649 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-znwgp" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.669901 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96ab678-9851-4682-a48b-6e977e283327-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.684697 4782 scope.go:117] "RemoveContainer" containerID="5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.697103 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-2blvc" podStartSLOduration=1.454206748 podStartE2EDuration="4.697075729s" podCreationTimestamp="2026-01-30 18:42:00 +0000 UTC" firstStartedPulling="2026-01-30 18:42:00.920883283 +0000 UTC m=+697.189261308" lastFinishedPulling="2026-01-30 18:42:04.163752224 +0000 UTC m=+700.432130289" observedRunningTime="2026-01-30 18:42:04.680210922 +0000 UTC m=+700.948588987" watchObservedRunningTime="2026-01-30 18:42:04.697075729 +0000 UTC m=+700.965453754" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.711706 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" podStartSLOduration=2.18102002 podStartE2EDuration="4.711676161s" podCreationTimestamp="2026-01-30 18:42:00 +0000 UTC" firstStartedPulling="2026-01-30 18:42:01.636955699 +0000 UTC m=+697.905333724" lastFinishedPulling="2026-01-30 18:42:04.1676118 +0000 UTC m=+700.435989865" observedRunningTime="2026-01-30 18:42:04.702949445 +0000 UTC m=+700.971327490" watchObservedRunningTime="2026-01-30 18:42:04.711676161 +0000 UTC m=+700.980054206" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.744453 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z6fgp" podStartSLOduration=1.7583688830000002 podStartE2EDuration="4.744427531s" podCreationTimestamp="2026-01-30 18:42:00 +0000 UTC" firstStartedPulling="2026-01-30 18:42:01.177913872 +0000 UTC m=+697.446291897" lastFinishedPulling="2026-01-30 18:42:04.16397251 +0000 UTC m=+700.432350545" observedRunningTime="2026-01-30 18:42:04.736567016 +0000 UTC m=+701.004945031" watchObservedRunningTime="2026-01-30 18:42:04.744427531 +0000 UTC m=+701.012805596" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.759855 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-znwgp"] Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.764763 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-znwgp"] Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.800485 4782 scope.go:117] "RemoveContainer" containerID="8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.824764 4782 scope.go:117] "RemoveContainer" containerID="5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6" Jan 30 18:42:04 crc kubenswrapper[4782]: E0130 18:42:04.825037 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6\": container with ID starting with 5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6 not found: ID does not exist" 
containerID="5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.825064 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6"} err="failed to get container status \"5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6\": rpc error: code = NotFound desc = could not find container \"5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6\": container with ID starting with 5ae50b93cb0eeb269213e5fb1faa535f252857601545554c6f636d9c6bbfc3c6 not found: ID does not exist" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.825085 4782 scope.go:117] "RemoveContainer" containerID="5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159" Jan 30 18:42:04 crc kubenswrapper[4782]: E0130 18:42:04.825442 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159\": container with ID starting with 5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159 not found: ID does not exist" containerID="5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.825489 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159"} err="failed to get container status \"5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159\": rpc error: code = NotFound desc = could not find container \"5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159\": container with ID starting with 5f55e186dd6a5e1017138e605138e841291947d5c76c5cc1487e14b488b32159 not found: ID does not exist" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.825522 4782 scope.go:117] "RemoveContainer" containerID="8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c" Jan 30 18:42:04 crc kubenswrapper[4782]: E0130 18:42:04.826009 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c\": container with ID starting with 8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c not found: ID does not exist" containerID="8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c" Jan 30 18:42:04 crc kubenswrapper[4782]: I0130 18:42:04.826172 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c"} err="failed to get container status \"8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c\": rpc error: code = NotFound desc = could not find container \"8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c\": container with ID starting with 8f2c1db424ae6f68b0637a0fcc4dd1d694baceb1bbc5f05af67191499ed7e20c not found: ID does not exist" Jan 30 18:42:06 crc kubenswrapper[4782]: I0130 18:42:06.418849 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96ab678-9851-4682-a48b-6e977e283327" path="/var/lib/kubelet/pods/c96ab678-9851-4682-a48b-6e977e283327/volumes" Jan 30 18:42:07 crc kubenswrapper[4782]: I0130 18:42:07.702298 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" event={"ID":"8bab1b5d-f025-4df0-ba3c-d406621dd5ac","Type":"ContainerStarted","Data":"2b4364e1a2fe898654cadef4214c4cd791ff130db9d3847a0dde0be683d0cdb4"} Jan 30 18:42:07 crc kubenswrapper[4782]: I0130 18:42:07.744453 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-5bs9t" podStartSLOduration=2.372278462 podStartE2EDuration="7.744413403s" podCreationTimestamp="2026-01-30 18:42:00 +0000 UTC" firstStartedPulling="2026-01-30 18:42:01.309275742 +0000 UTC m=+697.577653767" lastFinishedPulling="2026-01-30 18:42:06.681410643 +0000 UTC m=+702.949788708" observedRunningTime="2026-01-30 18:42:07.723469095 +0000 UTC m=+703.991847120" watchObservedRunningTime="2026-01-30 18:42:07.744413403 +0000 UTC m=+704.012791428" Jan 30 18:42:10 crc kubenswrapper[4782]: I0130 18:42:10.909628 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-2blvc" Jan 30 18:42:11 crc kubenswrapper[4782]: I0130 18:42:11.232755 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:11 crc kubenswrapper[4782]: I0130 18:42:11.232825 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:11 crc kubenswrapper[4782]: I0130 18:42:11.243853 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:11 crc kubenswrapper[4782]: I0130 18:42:11.735765 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-664cc6565-r2qz5" Jan 30 18:42:11 crc kubenswrapper[4782]: I0130 18:42:11.786002 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hcttm"] Jan 30 18:42:21 crc kubenswrapper[4782]: I0130 18:42:21.462298 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hsdpj" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.842634 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hcttm" podUID="22efd41f-5357-4820-afa4-09733ef60db0" containerName="console" containerID="cri-o://307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd" gracePeriod=15 Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.901843 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m"] Jan 30 18:42:36 crc kubenswrapper[4782]: E0130 18:42:36.902330 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="registry-server" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.902342 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="registry-server" Jan 30 18:42:36 crc kubenswrapper[4782]: E0130 18:42:36.902355 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="extract-content" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.902361 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="extract-content" Jan 30 18:42:36 crc kubenswrapper[4782]: E0130 18:42:36.902379 4782 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="extract-utilities" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.902386 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="extract-utilities" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.902474 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="registry-server" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.903247 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.904823 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.922850 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m"] Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.998252 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.998371 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcgqz\" (UniqueName: \"kubernetes.io/projected/7e835e31-1014-43e6-8bb6-34ad5fa00bab-kube-api-access-dcgqz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:36 crc kubenswrapper[4782]: I0130 18:42:36.998416 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.099361 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.099447 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcgqz\" (UniqueName: \"kubernetes.io/projected/7e835e31-1014-43e6-8bb6-34ad5fa00bab-kube-api-access-dcgqz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 
18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.099478 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.099977 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.100169 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.119692 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcgqz\" (UniqueName: \"kubernetes.io/projected/7e835e31-1014-43e6-8bb6-34ad5fa00bab-kube-api-access-dcgqz\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.218532 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hcttm_22efd41f-5357-4820-afa4-09733ef60db0/console/0.log" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.218592 4782 util.go:48] "No ready sandbox for pod can be found. 
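
When the 270996307… bundle pod is admitted at 18:42:36, the CPU and memory managers first drop state left over from the redhat-operators-znwgp pod deleted about half a minute earlier (podUID c96ab678…). The E-level "RemoveStaleState" lines above are generally routine cleanup of state for containers that no longer exist, not a failure of the new pod; grouping them by podUID makes that easy to confirm. A small sketch (sample lines abridged from the records above, with the inner quote escaping dropped; a real run would feed the whole journal in):

    import re
    from collections import defaultdict

    # Abridged copies of the 18:42:36.902 records above.
    lines = [
        'E0130 18:42:36.902330 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="registry-server"',
        'I0130 18:42:36.902342 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="registry-server"',
        'E0130 18:42:36.902355 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96ab678-9851-4682-a48b-6e977e283327" containerName="extract-content"',
    ]

    FIELD = re.compile(r'podUID="(?P<pod>[^"]+)" containerName="(?P<ctr>[^"]+)"')
    cleaned = defaultdict(set)
    for line in lines:
        if "RemoveStaleState" in line or "Deleted CPUSet assignment" in line:
            m = FIELD.search(line)
            if m:
                cleaned[m["pod"]].add(m["ctr"])

    for pod, containers in cleaned.items():
        print(pod, "->", sorted(containers))
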
Need to start a new one" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.302401 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-trusted-ca-bundle\") pod \"22efd41f-5357-4820-afa4-09733ef60db0\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.302447 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-oauth-serving-cert\") pod \"22efd41f-5357-4820-afa4-09733ef60db0\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.302494 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnkvj\" (UniqueName: \"kubernetes.io/projected/22efd41f-5357-4820-afa4-09733ef60db0-kube-api-access-mnkvj\") pod \"22efd41f-5357-4820-afa4-09733ef60db0\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.302544 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-console-config\") pod \"22efd41f-5357-4820-afa4-09733ef60db0\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.302565 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-service-ca\") pod \"22efd41f-5357-4820-afa4-09733ef60db0\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.302620 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-oauth-config\") pod \"22efd41f-5357-4820-afa4-09733ef60db0\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.302687 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-serving-cert\") pod \"22efd41f-5357-4820-afa4-09733ef60db0\" (UID: \"22efd41f-5357-4820-afa4-09733ef60db0\") " Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.303404 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "22efd41f-5357-4820-afa4-09733ef60db0" (UID: "22efd41f-5357-4820-afa4-09733ef60db0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.303579 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "22efd41f-5357-4820-afa4-09733ef60db0" (UID: "22efd41f-5357-4820-afa4-09733ef60db0"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.303945 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-console-config" (OuterVolumeSpecName: "console-config") pod "22efd41f-5357-4820-afa4-09733ef60db0" (UID: "22efd41f-5357-4820-afa4-09733ef60db0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.304448 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-service-ca" (OuterVolumeSpecName: "service-ca") pod "22efd41f-5357-4820-afa4-09733ef60db0" (UID: "22efd41f-5357-4820-afa4-09733ef60db0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.306643 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "22efd41f-5357-4820-afa4-09733ef60db0" (UID: "22efd41f-5357-4820-afa4-09733ef60db0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.310246 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "22efd41f-5357-4820-afa4-09733ef60db0" (UID: "22efd41f-5357-4820-afa4-09733ef60db0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.310337 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22efd41f-5357-4820-afa4-09733ef60db0-kube-api-access-mnkvj" (OuterVolumeSpecName: "kube-api-access-mnkvj") pod "22efd41f-5357-4820-afa4-09733ef60db0" (UID: "22efd41f-5357-4820-afa4-09733ef60db0"). InnerVolumeSpecName "kube-api-access-mnkvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.311454 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.404387 4782 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.404429 4782 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.404448 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnkvj\" (UniqueName: \"kubernetes.io/projected/22efd41f-5357-4820-afa4-09733ef60db0-kube-api-access-mnkvj\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.404463 4782 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.404479 4782 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22efd41f-5357-4820-afa4-09733ef60db0-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.404494 4782 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.404507 4782 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22efd41f-5357-4820-afa4-09733ef60db0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.604590 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m"] Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.933766 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hcttm_22efd41f-5357-4820-afa4-09733ef60db0/console/0.log" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.933826 4782 generic.go:334] "Generic (PLEG): container finished" podID="22efd41f-5357-4820-afa4-09733ef60db0" containerID="307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd" exitCode=2 Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.933911 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcttm" event={"ID":"22efd41f-5357-4820-afa4-09733ef60db0","Type":"ContainerDied","Data":"307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd"} Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.933936 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hcttm" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.933993 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcttm" event={"ID":"22efd41f-5357-4820-afa4-09733ef60db0","Type":"ContainerDied","Data":"a035f06032de2eb5afd1078c88dae055d70bbe165679851dc18a62f3ea624e8a"} Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.934019 4782 scope.go:117] "RemoveContainer" containerID="307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.935944 4782 generic.go:334] "Generic (PLEG): container finished" podID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerID="5afdaa49a73b9211d3c016e115294c3bd80c5f333772e98f49d2281cf5c0d66f" exitCode=0 Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.935968 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" event={"ID":"7e835e31-1014-43e6-8bb6-34ad5fa00bab","Type":"ContainerDied","Data":"5afdaa49a73b9211d3c016e115294c3bd80c5f333772e98f49d2281cf5c0d66f"} Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.935986 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" event={"ID":"7e835e31-1014-43e6-8bb6-34ad5fa00bab","Type":"ContainerStarted","Data":"7a0ce6f64971a3683ccc15926b5434538da099ec0d46512f72c3b027b5fd7114"} Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.957956 4782 scope.go:117] "RemoveContainer" containerID="307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd" Jan 30 18:42:37 crc kubenswrapper[4782]: E0130 18:42:37.958469 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd\": container with ID starting with 307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd not found: ID does not exist" containerID="307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.958519 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd"} err="failed to get container status \"307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd\": rpc error: code = NotFound desc = could not find container \"307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd\": container with ID starting with 307a13c2a4cc0953002b2567451aaf507546db8d8edc018ed99eae7551ff12dd not found: ID does not exist" Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.987564 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hcttm"] Jan 30 18:42:37 crc kubenswrapper[4782]: I0130 18:42:37.994799 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hcttm"] Jan 30 18:42:38 crc kubenswrapper[4782]: I0130 18:42:38.424672 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22efd41f-5357-4820-afa4-09733ef60db0" path="/var/lib/kubelet/pods/22efd41f-5357-4820-afa4-09733ef60db0/volumes" Jan 30 18:42:39 crc kubenswrapper[4782]: I0130 18:42:39.970144 4782 generic.go:334] "Generic (PLEG): container finished" podID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" 
containerID="5582fae59151e7650811456c32b253cb4ceb94b4d14ea6349baff1779623c51c" exitCode=0 Jan 30 18:42:39 crc kubenswrapper[4782]: I0130 18:42:39.970259 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" event={"ID":"7e835e31-1014-43e6-8bb6-34ad5fa00bab","Type":"ContainerDied","Data":"5582fae59151e7650811456c32b253cb4ceb94b4d14ea6349baff1779623c51c"} Jan 30 18:42:40 crc kubenswrapper[4782]: I0130 18:42:40.978676 4782 generic.go:334] "Generic (PLEG): container finished" podID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerID="a1d2f8e465dfddff476a08f4aa4f0a4104d2a711d0b64241daa34c758f91c738" exitCode=0 Jan 30 18:42:40 crc kubenswrapper[4782]: I0130 18:42:40.978739 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" event={"ID":"7e835e31-1014-43e6-8bb6-34ad5fa00bab","Type":"ContainerDied","Data":"a1d2f8e465dfddff476a08f4aa4f0a4104d2a711d0b64241daa34c758f91c738"} Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.369634 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.466846 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-util\") pod \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.466916 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcgqz\" (UniqueName: \"kubernetes.io/projected/7e835e31-1014-43e6-8bb6-34ad5fa00bab-kube-api-access-dcgqz\") pod \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.466959 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-bundle\") pod \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\" (UID: \"7e835e31-1014-43e6-8bb6-34ad5fa00bab\") " Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.469022 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-bundle" (OuterVolumeSpecName: "bundle") pod "7e835e31-1014-43e6-8bb6-34ad5fa00bab" (UID: "7e835e31-1014-43e6-8bb6-34ad5fa00bab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.476203 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e835e31-1014-43e6-8bb6-34ad5fa00bab-kube-api-access-dcgqz" (OuterVolumeSpecName: "kube-api-access-dcgqz") pod "7e835e31-1014-43e6-8bb6-34ad5fa00bab" (UID: "7e835e31-1014-43e6-8bb6-34ad5fa00bab"). InnerVolumeSpecName "kube-api-access-dcgqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.504489 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-util" (OuterVolumeSpecName: "util") pod "7e835e31-1014-43e6-8bb6-34ad5fa00bab" (UID: "7e835e31-1014-43e6-8bb6-34ad5fa00bab"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.568770 4782 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.568811 4782 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e835e31-1014-43e6-8bb6-34ad5fa00bab-util\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.568824 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcgqz\" (UniqueName: \"kubernetes.io/projected/7e835e31-1014-43e6-8bb6-34ad5fa00bab-kube-api-access-dcgqz\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.999012 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" event={"ID":"7e835e31-1014-43e6-8bb6-34ad5fa00bab","Type":"ContainerDied","Data":"7a0ce6f64971a3683ccc15926b5434538da099ec0d46512f72c3b027b5fd7114"} Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.999053 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m" Jan 30 18:42:42 crc kubenswrapper[4782]: I0130 18:42:42.999083 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a0ce6f64971a3683ccc15926b5434538da099ec0d46512f72c3b027b5fd7114" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.826729 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rw7jz"] Jan 30 18:42:43 crc kubenswrapper[4782]: E0130 18:42:43.826999 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22efd41f-5357-4820-afa4-09733ef60db0" containerName="console" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.827014 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="22efd41f-5357-4820-afa4-09733ef60db0" containerName="console" Jan 30 18:42:43 crc kubenswrapper[4782]: E0130 18:42:43.827026 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerName="pull" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.827034 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerName="pull" Jan 30 18:42:43 crc kubenswrapper[4782]: E0130 18:42:43.827048 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerName="extract" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.827056 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerName="extract" Jan 30 18:42:43 crc kubenswrapper[4782]: E0130 18:42:43.827071 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerName="util" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.827079 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerName="util" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.828283 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="22efd41f-5357-4820-afa4-09733ef60db0" 
containerName="console" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.828308 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e835e31-1014-43e6-8bb6-34ad5fa00bab" containerName="extract" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.829310 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.840137 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rw7jz"] Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.885693 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-utilities\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.885979 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkgcp\" (UniqueName: \"kubernetes.io/projected/c966cddc-f08f-4d95-89a4-c148d5edff85-kube-api-access-bkgcp\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.886080 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-catalog-content\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.987593 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkgcp\" (UniqueName: \"kubernetes.io/projected/c966cddc-f08f-4d95-89a4-c148d5edff85-kube-api-access-bkgcp\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.987696 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-catalog-content\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.987731 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-utilities\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.988503 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-utilities\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:43 crc kubenswrapper[4782]: I0130 18:42:43.988742 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-catalog-content\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:44 crc kubenswrapper[4782]: I0130 18:42:44.019494 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkgcp\" (UniqueName: \"kubernetes.io/projected/c966cddc-f08f-4d95-89a4-c148d5edff85-kube-api-access-bkgcp\") pod \"community-operators-rw7jz\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:44 crc kubenswrapper[4782]: I0130 18:42:44.169731 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:44 crc kubenswrapper[4782]: I0130 18:42:44.705157 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rw7jz"] Jan 30 18:42:45 crc kubenswrapper[4782]: I0130 18:42:45.015512 4782 generic.go:334] "Generic (PLEG): container finished" podID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerID="04e89f9a5d1a36f1bb43b3376a22462d130a2791060995cce4f5e2046c4d924a" exitCode=0 Jan 30 18:42:45 crc kubenswrapper[4782]: I0130 18:42:45.015607 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw7jz" event={"ID":"c966cddc-f08f-4d95-89a4-c148d5edff85","Type":"ContainerDied","Data":"04e89f9a5d1a36f1bb43b3376a22462d130a2791060995cce4f5e2046c4d924a"} Jan 30 18:42:45 crc kubenswrapper[4782]: I0130 18:42:45.016074 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw7jz" event={"ID":"c966cddc-f08f-4d95-89a4-c148d5edff85","Type":"ContainerStarted","Data":"6a9c4aa482c62ddadcf38c397ef70ac3a4d095b3f0be1f6e094ade7e6f38df69"} Jan 30 18:42:46 crc kubenswrapper[4782]: I0130 18:42:46.024978 4782 generic.go:334] "Generic (PLEG): container finished" podID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerID="8fe53e0d3838421b90883368dc303e9d301aa7d0f0f00d09323cfada6e572a25" exitCode=0 Jan 30 18:42:46 crc kubenswrapper[4782]: I0130 18:42:46.025040 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw7jz" event={"ID":"c966cddc-f08f-4d95-89a4-c148d5edff85","Type":"ContainerDied","Data":"8fe53e0d3838421b90883368dc303e9d301aa7d0f0f00d09323cfada6e572a25"} Jan 30 18:42:47 crc kubenswrapper[4782]: I0130 18:42:47.034039 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw7jz" event={"ID":"c966cddc-f08f-4d95-89a4-c148d5edff85","Type":"ContainerStarted","Data":"4130610c1292650b7d0352c89611dc7ac35e580886131441f81821079374d8fc"} Jan 30 18:42:47 crc kubenswrapper[4782]: I0130 18:42:47.049303 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rw7jz" podStartSLOduration=2.587898313 podStartE2EDuration="4.049222164s" podCreationTimestamp="2026-01-30 18:42:43 +0000 UTC" firstStartedPulling="2026-01-30 18:42:45.01680821 +0000 UTC m=+741.285186245" lastFinishedPulling="2026-01-30 18:42:46.478132071 +0000 UTC m=+742.746510096" observedRunningTime="2026-01-30 18:42:47.047583253 +0000 UTC m=+743.315961278" watchObservedRunningTime="2026-01-30 18:42:47.049222164 +0000 UTC m=+743.317600219" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.407406 4782 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["metallb-system/metallb-operator-controller-manager-597897467b-d7mjb"] Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.408894 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.410870 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-db66z" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.410959 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.411164 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.411737 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.411985 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.421701 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-597897467b-d7mjb"] Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.564102 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a536d77e-78b4-4ec2-a0d2-80e853e186fb-apiservice-cert\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.564157 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a536d77e-78b4-4ec2-a0d2-80e853e186fb-webhook-cert\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.564218 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6dlc\" (UniqueName: \"kubernetes.io/projected/a536d77e-78b4-4ec2-a0d2-80e853e186fb-kube-api-access-z6dlc\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.665136 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a536d77e-78b4-4ec2-a0d2-80e853e186fb-apiservice-cert\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.665194 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a536d77e-78b4-4ec2-a0d2-80e853e186fb-webhook-cert\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: 
\"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.665287 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6dlc\" (UniqueName: \"kubernetes.io/projected/a536d77e-78b4-4ec2-a0d2-80e853e186fb-kube-api-access-z6dlc\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.668424 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs"] Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.669129 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.677668 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.677737 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.677804 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-fm68w" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.678190 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a536d77e-78b4-4ec2-a0d2-80e853e186fb-apiservice-cert\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.678196 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a536d77e-78b4-4ec2-a0d2-80e853e186fb-webhook-cert\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.687341 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs"] Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.695612 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6dlc\" (UniqueName: \"kubernetes.io/projected/a536d77e-78b4-4ec2-a0d2-80e853e186fb-kube-api-access-z6dlc\") pod \"metallb-operator-controller-manager-597897467b-d7mjb\" (UID: \"a536d77e-78b4-4ec2-a0d2-80e853e186fb\") " pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.723896 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.766033 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrwc\" (UniqueName: \"kubernetes.io/projected/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-kube-api-access-pmrwc\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.766082 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-apiservice-cert\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.766248 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-webhook-cert\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.867120 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-webhook-cert\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.867486 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrwc\" (UniqueName: \"kubernetes.io/projected/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-kube-api-access-pmrwc\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.867520 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-apiservice-cert\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.871341 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-apiservice-cert\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.871629 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-webhook-cert\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " 
pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.885793 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrwc\" (UniqueName: \"kubernetes.io/projected/c696687d-14f1-4f3b-b9ee-36e3845aa7c2-kube-api-access-pmrwc\") pod \"metallb-operator-webhook-server-bccd97cd9-rmxjs\" (UID: \"c696687d-14f1-4f3b-b9ee-36e3845aa7c2\") " pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:53 crc kubenswrapper[4782]: I0130 18:42:53.978181 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-597897467b-d7mjb"] Jan 30 18:42:53 crc kubenswrapper[4782]: W0130 18:42:53.985705 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda536d77e_78b4_4ec2_a0d2_80e853e186fb.slice/crio-a28318a8e9d672f698a85ce2a9171cc95f730aa316e5aae00eda51856b178780 WatchSource:0}: Error finding container a28318a8e9d672f698a85ce2a9171cc95f730aa316e5aae00eda51856b178780: Status 404 returned error can't find the container with id a28318a8e9d672f698a85ce2a9171cc95f730aa316e5aae00eda51856b178780 Jan 30 18:42:54 crc kubenswrapper[4782]: I0130 18:42:54.030104 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:42:54 crc kubenswrapper[4782]: I0130 18:42:54.081163 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" event={"ID":"a536d77e-78b4-4ec2-a0d2-80e853e186fb","Type":"ContainerStarted","Data":"a28318a8e9d672f698a85ce2a9171cc95f730aa316e5aae00eda51856b178780"} Jan 30 18:42:54 crc kubenswrapper[4782]: I0130 18:42:54.170904 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:54 crc kubenswrapper[4782]: I0130 18:42:54.171204 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:54 crc kubenswrapper[4782]: I0130 18:42:54.211019 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs"] Jan 30 18:42:54 crc kubenswrapper[4782]: W0130 18:42:54.224341 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc696687d_14f1_4f3b_b9ee_36e3845aa7c2.slice/crio-79ee65e3766195cdebdc15f325177589fa75443d2786c22b04303ff43de00650 WatchSource:0}: Error finding container 79ee65e3766195cdebdc15f325177589fa75443d2786c22b04303ff43de00650: Status 404 returned error can't find the container with id 79ee65e3766195cdebdc15f325177589fa75443d2786c22b04303ff43de00650 Jan 30 18:42:54 crc kubenswrapper[4782]: I0130 18:42:54.226082 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:55 crc kubenswrapper[4782]: I0130 18:42:55.088411 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" event={"ID":"c696687d-14f1-4f3b-b9ee-36e3845aa7c2","Type":"ContainerStarted","Data":"79ee65e3766195cdebdc15f325177589fa75443d2786c22b04303ff43de00650"} Jan 30 18:42:55 crc kubenswrapper[4782]: I0130 18:42:55.135920 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:56 crc kubenswrapper[4782]: I0130 18:42:56.814062 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rw7jz"] Jan 30 18:42:57 crc kubenswrapper[4782]: I0130 18:42:57.106386 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rw7jz" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="registry-server" containerID="cri-o://4130610c1292650b7d0352c89611dc7ac35e580886131441f81821079374d8fc" gracePeriod=2 Jan 30 18:42:58 crc kubenswrapper[4782]: I0130 18:42:58.115106 4782 generic.go:334] "Generic (PLEG): container finished" podID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerID="4130610c1292650b7d0352c89611dc7ac35e580886131441f81821079374d8fc" exitCode=0 Jan 30 18:42:58 crc kubenswrapper[4782]: I0130 18:42:58.115168 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw7jz" event={"ID":"c966cddc-f08f-4d95-89a4-c148d5edff85","Type":"ContainerDied","Data":"4130610c1292650b7d0352c89611dc7ac35e580886131441f81821079374d8fc"} Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.212756 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.346394 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkgcp\" (UniqueName: \"kubernetes.io/projected/c966cddc-f08f-4d95-89a4-c148d5edff85-kube-api-access-bkgcp\") pod \"c966cddc-f08f-4d95-89a4-c148d5edff85\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.347399 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-catalog-content\") pod \"c966cddc-f08f-4d95-89a4-c148d5edff85\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.347447 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-utilities\") pod \"c966cddc-f08f-4d95-89a4-c148d5edff85\" (UID: \"c966cddc-f08f-4d95-89a4-c148d5edff85\") " Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.348185 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-utilities" (OuterVolumeSpecName: "utilities") pod "c966cddc-f08f-4d95-89a4-c148d5edff85" (UID: "c966cddc-f08f-4d95-89a4-c148d5edff85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.352272 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c966cddc-f08f-4d95-89a4-c148d5edff85-kube-api-access-bkgcp" (OuterVolumeSpecName: "kube-api-access-bkgcp") pod "c966cddc-f08f-4d95-89a4-c148d5edff85" (UID: "c966cddc-f08f-4d95-89a4-c148d5edff85"). InnerVolumeSpecName "kube-api-access-bkgcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.400152 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c966cddc-f08f-4d95-89a4-c148d5edff85" (UID: "c966cddc-f08f-4d95-89a4-c148d5edff85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.448695 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.448751 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkgcp\" (UniqueName: \"kubernetes.io/projected/c966cddc-f08f-4d95-89a4-c148d5edff85-kube-api-access-bkgcp\") on node \"crc\" DevicePath \"\"" Jan 30 18:42:59 crc kubenswrapper[4782]: I0130 18:42:59.448772 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c966cddc-f08f-4d95-89a4-c148d5edff85-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.135661 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw7jz" event={"ID":"c966cddc-f08f-4d95-89a4-c148d5edff85","Type":"ContainerDied","Data":"6a9c4aa482c62ddadcf38c397ef70ac3a4d095b3f0be1f6e094ade7e6f38df69"} Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.136026 4782 scope.go:117] "RemoveContainer" containerID="4130610c1292650b7d0352c89611dc7ac35e580886131441f81821079374d8fc" Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.135787 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rw7jz" Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.165814 4782 scope.go:117] "RemoveContainer" containerID="8fe53e0d3838421b90883368dc303e9d301aa7d0f0f00d09323cfada6e572a25" Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.192757 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rw7jz"] Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.213431 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rw7jz"] Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.229934 4782 scope.go:117] "RemoveContainer" containerID="04e89f9a5d1a36f1bb43b3376a22462d130a2791060995cce4f5e2046c4d924a" Jan 30 18:43:00 crc kubenswrapper[4782]: I0130 18:43:00.422028 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" path="/var/lib/kubelet/pods/c966cddc-f08f-4d95-89a4-c148d5edff85/volumes" Jan 30 18:43:01 crc kubenswrapper[4782]: I0130 18:43:01.143546 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" event={"ID":"a536d77e-78b4-4ec2-a0d2-80e853e186fb","Type":"ContainerStarted","Data":"f1726322ace07b464af62293d03810bd5bbb74fd386d5175aed91e5619a2d095"} Jan 30 18:43:01 crc kubenswrapper[4782]: I0130 18:43:01.143704 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:43:01 crc kubenswrapper[4782]: I0130 18:43:01.145169 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" event={"ID":"c696687d-14f1-4f3b-b9ee-36e3845aa7c2","Type":"ContainerStarted","Data":"d04d24aa233837c2e93fe933b9156c8fdbb2e6d23183c7568285e60ab43e8357"} Jan 30 18:43:01 crc kubenswrapper[4782]: I0130 18:43:01.145404 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:43:01 crc kubenswrapper[4782]: I0130 18:43:01.171006 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" podStartSLOduration=3.203075126 podStartE2EDuration="8.170985942s" podCreationTimestamp="2026-01-30 18:42:53 +0000 UTC" firstStartedPulling="2026-01-30 18:42:53.987269093 +0000 UTC m=+750.255647118" lastFinishedPulling="2026-01-30 18:42:58.955179869 +0000 UTC m=+755.223557934" observedRunningTime="2026-01-30 18:43:01.166843589 +0000 UTC m=+757.435221614" watchObservedRunningTime="2026-01-30 18:43:01.170985942 +0000 UTC m=+757.439363977" Jan 30 18:43:01 crc kubenswrapper[4782]: I0130 18:43:01.196295 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" podStartSLOduration=3.433840826 podStartE2EDuration="8.196273388s" podCreationTimestamp="2026-01-30 18:42:53 +0000 UTC" firstStartedPulling="2026-01-30 18:42:54.2275973 +0000 UTC m=+750.495975325" lastFinishedPulling="2026-01-30 18:42:58.990029862 +0000 UTC m=+755.258407887" observedRunningTime="2026-01-30 18:43:01.191182432 +0000 UTC m=+757.459560457" watchObservedRunningTime="2026-01-30 18:43:01.196273388 +0000 UTC m=+757.464651423" Jan 30 18:43:14 crc kubenswrapper[4782]: I0130 18:43:14.035194 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-bccd97cd9-rmxjs" Jan 30 18:43:19 crc kubenswrapper[4782]: I0130 18:43:19.792821 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:43:19 crc kubenswrapper[4782]: I0130 18:43:19.793939 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:43:33 crc kubenswrapper[4782]: I0130 18:43:33.726721 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-597897467b-d7mjb" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.617952 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg"] Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.618720 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="extract-content" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.618749 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="extract-content" Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.618767 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="registry-server" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.618781 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="registry-server" Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.618803 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="extract-utilities" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.618815 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="extract-utilities" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.619001 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c966cddc-f08f-4d95-89a4-c148d5edff85" containerName="registry-server" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.620111 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.622647 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.623034 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gvkj8" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.629124 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg"] Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.632721 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67ea755b-acbd-4894-9070-356cb15f18d3-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-m7rqg\" (UID: \"67ea755b-acbd-4894-9070-356cb15f18d3\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.632794 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvsw\" (UniqueName: \"kubernetes.io/projected/67ea755b-acbd-4894-9070-356cb15f18d3-kube-api-access-vbvsw\") pod \"frr-k8s-webhook-server-7df86c4f6c-m7rqg\" (UID: \"67ea755b-acbd-4894-9070-356cb15f18d3\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.636943 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wt4wf"] Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.641271 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.643366 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.643746 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.702403 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jqzpm"] Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.703672 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.706576 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.706582 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.706582 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-87c6q" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.707788 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734458 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjh7\" (UniqueName: \"kubernetes.io/projected/1da61d3b-efb6-453e-8e4b-ca98c629c39a-kube-api-access-6tjh7\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734479 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-nlj8p"] Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734508 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734533 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-sockets\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734554 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67ea755b-acbd-4894-9070-356cb15f18d3-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-m7rqg\" (UID: \"67ea755b-acbd-4894-9070-356cb15f18d3\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734599 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvsw\" (UniqueName: \"kubernetes.io/projected/67ea755b-acbd-4894-9070-356cb15f18d3-kube-api-access-vbvsw\") pod \"frr-k8s-webhook-server-7df86c4f6c-m7rqg\" (UID: \"67ea755b-acbd-4894-9070-356cb15f18d3\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734619 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-metrics-certs\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734639 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/824018fe-7708-4c75-aaac-19bfb9f22405-metallb-excludel2\") pod 
\"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734660 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5r8d\" (UniqueName: \"kubernetes.io/projected/824018fe-7708-4c75-aaac-19bfb9f22405-kube-api-access-m5r8d\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734675 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-conf\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734693 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-metrics\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734711 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da61d3b-efb6-453e-8e4b-ca98c629c39a-metrics-certs\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734730 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-startup\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.734748 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-reloader\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.735556 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.740732 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.741013 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-nlj8p"] Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.742898 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/67ea755b-acbd-4894-9070-356cb15f18d3-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-m7rqg\" (UID: \"67ea755b-acbd-4894-9070-356cb15f18d3\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.757155 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbvsw\" (UniqueName: \"kubernetes.io/projected/67ea755b-acbd-4894-9070-356cb15f18d3-kube-api-access-vbvsw\") pod \"frr-k8s-webhook-server-7df86c4f6c-m7rqg\" (UID: \"67ea755b-acbd-4894-9070-356cb15f18d3\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835660 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-metrics-certs\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835718 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-metrics-certs\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835755 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/824018fe-7708-4c75-aaac-19bfb9f22405-metallb-excludel2\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835786 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5r8d\" (UniqueName: \"kubernetes.io/projected/824018fe-7708-4c75-aaac-19bfb9f22405-kube-api-access-m5r8d\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835807 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-conf\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835838 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-metrics\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.835849 4782 secret.go:188] Couldn't get secret 
metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835861 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cpkl\" (UniqueName: \"kubernetes.io/projected/b45e2233-e51f-4f71-bc45-cd73fa8302de-kube-api-access-7cpkl\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835930 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-cert\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.835949 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-metrics-certs podName:824018fe-7708-4c75-aaac-19bfb9f22405 nodeName:}" failed. No retries permitted until 2026-01-30 18:43:35.335929568 +0000 UTC m=+791.604307593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-metrics-certs") pod "speaker-jqzpm" (UID: "824018fe-7708-4c75-aaac-19bfb9f22405") : secret "speaker-certs-secret" not found Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.835979 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da61d3b-efb6-453e-8e4b-ca98c629c39a-metrics-certs\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836024 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-startup\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836049 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-reloader\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836097 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjh7\" (UniqueName: \"kubernetes.io/projected/1da61d3b-efb6-453e-8e4b-ca98c629c39a-kube-api-access-6tjh7\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836143 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836169 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-sockets\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836315 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-metrics\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836524 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-conf\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.836552 4782 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.836582 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist podName:824018fe-7708-4c75-aaac-19bfb9f22405 nodeName:}" failed. No retries permitted until 2026-01-30 18:43:35.336573474 +0000 UTC m=+791.604951499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist") pod "speaker-jqzpm" (UID: "824018fe-7708-4c75-aaac-19bfb9f22405") : secret "metallb-memberlist" not found Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836551 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-sockets\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836609 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1da61d3b-efb6-453e-8e4b-ca98c629c39a-reloader\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.836622 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/824018fe-7708-4c75-aaac-19bfb9f22405-metallb-excludel2\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.837080 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1da61d3b-efb6-453e-8e4b-ca98c629c39a-frr-startup\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.849794 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da61d3b-efb6-453e-8e4b-ca98c629c39a-metrics-certs\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.853689 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-m5r8d\" (UniqueName: \"kubernetes.io/projected/824018fe-7708-4c75-aaac-19bfb9f22405-kube-api-access-m5r8d\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.854055 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjh7\" (UniqueName: \"kubernetes.io/projected/1da61d3b-efb6-453e-8e4b-ca98c629c39a-kube-api-access-6tjh7\") pod \"frr-k8s-wt4wf\" (UID: \"1da61d3b-efb6-453e-8e4b-ca98c629c39a\") " pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.934948 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.938292 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cpkl\" (UniqueName: \"kubernetes.io/projected/b45e2233-e51f-4f71-bc45-cd73fa8302de-kube-api-access-7cpkl\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.938435 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-cert\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.938648 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-metrics-certs\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.938790 4782 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 30 18:43:34 crc kubenswrapper[4782]: E0130 18:43:34.938889 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-metrics-certs podName:b45e2233-e51f-4f71-bc45-cd73fa8302de nodeName:}" failed. No retries permitted until 2026-01-30 18:43:35.43886613 +0000 UTC m=+791.707244155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-metrics-certs") pod "controller-6968d8fdc4-nlj8p" (UID: "b45e2233-e51f-4f71-bc45-cd73fa8302de") : secret "controller-certs-secret" not found Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.940449 4782 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.955778 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-cert\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.957338 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:34 crc kubenswrapper[4782]: I0130 18:43:34.959913 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cpkl\" (UniqueName: \"kubernetes.io/projected/b45e2233-e51f-4f71-bc45-cd73fa8302de-kube-api-access-7cpkl\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.152333 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg"] Jan 30 18:43:35 crc kubenswrapper[4782]: W0130 18:43:35.159070 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ea755b_acbd_4894_9070_356cb15f18d3.slice/crio-512d3733a0243b45907b57e6cc3d54e71633e8ea93a71d5ce4a5039b9854c263 WatchSource:0}: Error finding container 512d3733a0243b45907b57e6cc3d54e71633e8ea93a71d5ce4a5039b9854c263: Status 404 returned error can't find the container with id 512d3733a0243b45907b57e6cc3d54e71633e8ea93a71d5ce4a5039b9854c263 Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.345032 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-metrics-certs\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.345276 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:35 crc kubenswrapper[4782]: E0130 18:43:35.345483 4782 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 18:43:35 crc kubenswrapper[4782]: E0130 18:43:35.345604 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist podName:824018fe-7708-4c75-aaac-19bfb9f22405 nodeName:}" failed. No retries permitted until 2026-01-30 18:43:36.345577552 +0000 UTC m=+792.613955607 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist") pod "speaker-jqzpm" (UID: "824018fe-7708-4c75-aaac-19bfb9f22405") : secret "metallb-memberlist" not found Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.350861 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-metrics-certs\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.446905 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-metrics-certs\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.449672 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b45e2233-e51f-4f71-bc45-cd73fa8302de-metrics-certs\") pod \"controller-6968d8fdc4-nlj8p\" (UID: \"b45e2233-e51f-4f71-bc45-cd73fa8302de\") " pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.661268 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" event={"ID":"67ea755b-acbd-4894-9070-356cb15f18d3","Type":"ContainerStarted","Data":"512d3733a0243b45907b57e6cc3d54e71633e8ea93a71d5ce4a5039b9854c263"} Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.663158 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerStarted","Data":"c550dbe3fdcffbde3e32f05743b50b0570f4d12c2108dd8557de03533a321565"} Jan 30 18:43:35 crc kubenswrapper[4782]: I0130 18:43:35.689790 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.006393 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-nlj8p"] Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.359476 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.370060 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/824018fe-7708-4c75-aaac-19bfb9f22405-memberlist\") pod \"speaker-jqzpm\" (UID: \"824018fe-7708-4c75-aaac-19bfb9f22405\") " pod="metallb-system/speaker-jqzpm" Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.516063 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jqzpm" Jan 30 18:43:36 crc kubenswrapper[4782]: W0130 18:43:36.549706 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824018fe_7708_4c75_aaac_19bfb9f22405.slice/crio-e1467e2adbc560e8e5932f60f37e196df04b76bc54d127fa662c8e31036405af WatchSource:0}: Error finding container e1467e2adbc560e8e5932f60f37e196df04b76bc54d127fa662c8e31036405af: Status 404 returned error can't find the container with id e1467e2adbc560e8e5932f60f37e196df04b76bc54d127fa662c8e31036405af Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.670323 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jqzpm" event={"ID":"824018fe-7708-4c75-aaac-19bfb9f22405","Type":"ContainerStarted","Data":"e1467e2adbc560e8e5932f60f37e196df04b76bc54d127fa662c8e31036405af"} Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.672686 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-nlj8p" event={"ID":"b45e2233-e51f-4f71-bc45-cd73fa8302de","Type":"ContainerStarted","Data":"5b7dd062614562a902b04152498a312aebb11d8de6efefb3af693bdf36705e98"} Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.672721 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-nlj8p" event={"ID":"b45e2233-e51f-4f71-bc45-cd73fa8302de","Type":"ContainerStarted","Data":"bdd08fb4940d234ef14f95ce80664e015dde98d1a555c53c50091eb27d3d73a3"} Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.672735 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-nlj8p" event={"ID":"b45e2233-e51f-4f71-bc45-cd73fa8302de","Type":"ContainerStarted","Data":"12c7278dae253135dde7cbfc878e93269c0a165f7557f0f835270075e7bad8ed"} Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.673841 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:36 crc kubenswrapper[4782]: I0130 18:43:36.696120 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-nlj8p" podStartSLOduration=2.696098188 podStartE2EDuration="2.696098188s" podCreationTimestamp="2026-01-30 18:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:43:36.693817511 +0000 UTC m=+792.962195546" watchObservedRunningTime="2026-01-30 18:43:36.696098188 +0000 UTC m=+792.964476223" Jan 30 18:43:37 crc kubenswrapper[4782]: I0130 18:43:37.688525 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jqzpm" event={"ID":"824018fe-7708-4c75-aaac-19bfb9f22405","Type":"ContainerStarted","Data":"04f71c842bc9c3e8a9c090a2e5311375471864477124bb82302ceabdfbf6c01d"} Jan 30 18:43:37 crc kubenswrapper[4782]: I0130 18:43:37.688573 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jqzpm" event={"ID":"824018fe-7708-4c75-aaac-19bfb9f22405","Type":"ContainerStarted","Data":"15473e47fe723fe4d0ece16f77c40585400aea133f237370add74a62eb921174"} Jan 30 18:43:38 crc kubenswrapper[4782]: I0130 18:43:38.698092 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jqzpm" Jan 30 18:43:42 crc kubenswrapper[4782]: I0130 18:43:42.729994 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" event={"ID":"67ea755b-acbd-4894-9070-356cb15f18d3","Type":"ContainerStarted","Data":"63d195a59c8906ca01a244e90e2eae0dc5fdd7de31fc21e56194ff6aab1811f3"} Jan 30 18:43:42 crc kubenswrapper[4782]: I0130 18:43:42.730862 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:42 crc kubenswrapper[4782]: I0130 18:43:42.732871 4782 generic.go:334] "Generic (PLEG): container finished" podID="1da61d3b-efb6-453e-8e4b-ca98c629c39a" containerID="33c913866dae4465f15b46e1852567fe5411394ef9bba872b32b24fb72b49205" exitCode=0 Jan 30 18:43:42 crc kubenswrapper[4782]: I0130 18:43:42.732920 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerDied","Data":"33c913866dae4465f15b46e1852567fe5411394ef9bba872b32b24fb72b49205"} Jan 30 18:43:42 crc kubenswrapper[4782]: I0130 18:43:42.755618 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" podStartSLOduration=1.460286833 podStartE2EDuration="8.755598462s" podCreationTimestamp="2026-01-30 18:43:34 +0000 UTC" firstStartedPulling="2026-01-30 18:43:35.161083448 +0000 UTC m=+791.429461473" lastFinishedPulling="2026-01-30 18:43:42.456395077 +0000 UTC m=+798.724773102" observedRunningTime="2026-01-30 18:43:42.749575323 +0000 UTC m=+799.017953358" watchObservedRunningTime="2026-01-30 18:43:42.755598462 +0000 UTC m=+799.023976507" Jan 30 18:43:42 crc kubenswrapper[4782]: I0130 18:43:42.755833 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jqzpm" podStartSLOduration=8.755829528 podStartE2EDuration="8.755829528s" podCreationTimestamp="2026-01-30 18:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:43:37.706800662 +0000 UTC m=+793.975178707" watchObservedRunningTime="2026-01-30 18:43:42.755829528 +0000 UTC m=+799.024207563" Jan 30 18:43:43 crc kubenswrapper[4782]: I0130 18:43:43.742326 4782 generic.go:334] "Generic (PLEG): container finished" podID="1da61d3b-efb6-453e-8e4b-ca98c629c39a" containerID="841bc90fe97e25a9a5ebd1de9d1ff9721ce0e5d80b186e1760cc612176d4b0c1" exitCode=0 Jan 30 18:43:43 crc kubenswrapper[4782]: I0130 18:43:43.742423 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerDied","Data":"841bc90fe97e25a9a5ebd1de9d1ff9721ce0e5d80b186e1760cc612176d4b0c1"} Jan 30 18:43:44 crc kubenswrapper[4782]: I0130 18:43:44.751818 4782 generic.go:334] "Generic (PLEG): container finished" podID="1da61d3b-efb6-453e-8e4b-ca98c629c39a" containerID="1d426f92d4c553c79744c7324a722a039a3e2220cf93c99b85f680855f000e10" exitCode=0 Jan 30 18:43:44 crc kubenswrapper[4782]: I0130 18:43:44.751864 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerDied","Data":"1d426f92d4c553c79744c7324a722a039a3e2220cf93c99b85f680855f000e10"} Jan 30 18:43:45 crc kubenswrapper[4782]: I0130 18:43:45.767025 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" 
event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerStarted","Data":"25f547907981385f9e4c6ed5a03fdd697bbd99f7112e855959d677a241c9335f"} Jan 30 18:43:45 crc kubenswrapper[4782]: I0130 18:43:45.767556 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerStarted","Data":"8dbf87ff780e73a68f308d43eca0d212033f8a311b41ec01fcc00647a3632d4e"} Jan 30 18:43:45 crc kubenswrapper[4782]: I0130 18:43:45.767588 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerStarted","Data":"2be96fceefbb20aafd8947b5347ee5a298149ffd1111de7fbcdbd47b469d5a33"} Jan 30 18:43:45 crc kubenswrapper[4782]: I0130 18:43:45.767614 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerStarted","Data":"ba939a1eaa1403c0fa09c7fab7f65a6b116ed6858511d63009419c427e1a7f97"} Jan 30 18:43:45 crc kubenswrapper[4782]: I0130 18:43:45.767638 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerStarted","Data":"089f1ccfa6a237d6e01e2515ff97002a03aba2f886887d78edf8186dbcd2ee21"} Jan 30 18:43:46 crc kubenswrapper[4782]: I0130 18:43:46.524283 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jqzpm" Jan 30 18:43:46 crc kubenswrapper[4782]: I0130 18:43:46.782824 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wt4wf" event={"ID":"1da61d3b-efb6-453e-8e4b-ca98c629c39a","Type":"ContainerStarted","Data":"fd146f092e9a43d34efe27acb78c391f7ea54e56f7476d1934bf9f914d573c08"} Jan 30 18:43:46 crc kubenswrapper[4782]: I0130 18:43:46.783061 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:46 crc kubenswrapper[4782]: I0130 18:43:46.816439 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wt4wf" podStartSLOduration=5.451962001 podStartE2EDuration="12.816418043s" podCreationTimestamp="2026-01-30 18:43:34 +0000 UTC" firstStartedPulling="2026-01-30 18:43:35.093704308 +0000 UTC m=+791.362082333" lastFinishedPulling="2026-01-30 18:43:42.45816036 +0000 UTC m=+798.726538375" observedRunningTime="2026-01-30 18:43:46.815612133 +0000 UTC m=+803.083990158" watchObservedRunningTime="2026-01-30 18:43:46.816418043 +0000 UTC m=+803.084796078" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.502354 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-57gzf"] Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.504598 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-57gzf" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.506860 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6zpjc" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.507026 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.508170 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-57gzf"] Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.508387 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.563883 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jd5j\" (UniqueName: \"kubernetes.io/projected/7baed58d-56ba-4d44-a000-f14c036edecf-kube-api-access-8jd5j\") pod \"openstack-operator-index-57gzf\" (UID: \"7baed58d-56ba-4d44-a000-f14c036edecf\") " pod="openstack-operators/openstack-operator-index-57gzf" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.664935 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jd5j\" (UniqueName: \"kubernetes.io/projected/7baed58d-56ba-4d44-a000-f14c036edecf-kube-api-access-8jd5j\") pod \"openstack-operator-index-57gzf\" (UID: \"7baed58d-56ba-4d44-a000-f14c036edecf\") " pod="openstack-operators/openstack-operator-index-57gzf" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.686308 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jd5j\" (UniqueName: \"kubernetes.io/projected/7baed58d-56ba-4d44-a000-f14c036edecf-kube-api-access-8jd5j\") pod \"openstack-operator-index-57gzf\" (UID: \"7baed58d-56ba-4d44-a000-f14c036edecf\") " pod="openstack-operators/openstack-operator-index-57gzf" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.793373 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.793744 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.841708 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-57gzf" Jan 30 18:43:49 crc kubenswrapper[4782]: I0130 18:43:49.958217 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:50 crc kubenswrapper[4782]: I0130 18:43:50.001600 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:50 crc kubenswrapper[4782]: I0130 18:43:50.246747 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-57gzf"] Jan 30 18:43:50 crc kubenswrapper[4782]: I0130 18:43:50.809859 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57gzf" event={"ID":"7baed58d-56ba-4d44-a000-f14c036edecf","Type":"ContainerStarted","Data":"ccecfc0bdfbb383e8e5e8288b64e7f2612354327166660e91b8fc1dfa4750019"} Jan 30 18:43:52 crc kubenswrapper[4782]: I0130 18:43:52.872028 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-57gzf"] Jan 30 18:43:53 crc kubenswrapper[4782]: I0130 18:43:53.484112 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v7mq6"] Jan 30 18:43:53 crc kubenswrapper[4782]: I0130 18:43:53.485328 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:43:53 crc kubenswrapper[4782]: I0130 18:43:53.500758 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v7mq6"] Jan 30 18:43:53 crc kubenswrapper[4782]: I0130 18:43:53.529862 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5t2\" (UniqueName: \"kubernetes.io/projected/09760161-4b39-4185-9c1e-917ba1924171-kube-api-access-tg5t2\") pod \"openstack-operator-index-v7mq6\" (UID: \"09760161-4b39-4185-9c1e-917ba1924171\") " pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:43:53 crc kubenswrapper[4782]: I0130 18:43:53.630929 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5t2\" (UniqueName: \"kubernetes.io/projected/09760161-4b39-4185-9c1e-917ba1924171-kube-api-access-tg5t2\") pod \"openstack-operator-index-v7mq6\" (UID: \"09760161-4b39-4185-9c1e-917ba1924171\") " pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:43:53 crc kubenswrapper[4782]: I0130 18:43:53.657434 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5t2\" (UniqueName: \"kubernetes.io/projected/09760161-4b39-4185-9c1e-917ba1924171-kube-api-access-tg5t2\") pod \"openstack-operator-index-v7mq6\" (UID: \"09760161-4b39-4185-9c1e-917ba1924171\") " pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:43:53 crc kubenswrapper[4782]: I0130 18:43:53.809963 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.444061 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v7mq6"] Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.840978 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57gzf" event={"ID":"7baed58d-56ba-4d44-a000-f14c036edecf","Type":"ContainerStarted","Data":"221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8"} Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.841065 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-57gzf" podUID="7baed58d-56ba-4d44-a000-f14c036edecf" containerName="registry-server" containerID="cri-o://221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8" gracePeriod=2 Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.845058 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v7mq6" event={"ID":"09760161-4b39-4185-9c1e-917ba1924171","Type":"ContainerStarted","Data":"088c6c13bc4857449f461352971a1c3f94bc1233d56589cb654c9e7a65dfb60e"} Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.845121 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v7mq6" event={"ID":"09760161-4b39-4185-9c1e-917ba1924171","Type":"ContainerStarted","Data":"b724cd81d00866e0ca513c95f2f30b8b2af1deab247d72e690750a4c9008a93a"} Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.868954 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-57gzf" podStartSLOduration=1.920718592 podStartE2EDuration="5.868930421s" podCreationTimestamp="2026-01-30 18:43:49 +0000 UTC" firstStartedPulling="2026-01-30 18:43:50.258582989 +0000 UTC m=+806.526961014" lastFinishedPulling="2026-01-30 18:43:54.206794818 +0000 UTC m=+810.475172843" observedRunningTime="2026-01-30 18:43:54.860632195 +0000 UTC m=+811.129010270" watchObservedRunningTime="2026-01-30 18:43:54.868930421 +0000 UTC m=+811.137308466" Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.880864 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v7mq6" podStartSLOduration=1.82451222 podStartE2EDuration="1.880841716s" podCreationTimestamp="2026-01-30 18:43:53 +0000 UTC" firstStartedPulling="2026-01-30 18:43:54.468367822 +0000 UTC m=+810.736745847" lastFinishedPulling="2026-01-30 18:43:54.524697308 +0000 UTC m=+810.793075343" observedRunningTime="2026-01-30 18:43:54.8785551 +0000 UTC m=+811.146933145" watchObservedRunningTime="2026-01-30 18:43:54.880841716 +0000 UTC m=+811.149219741" Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.941992 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-m7rqg" Jan 30 18:43:54 crc kubenswrapper[4782]: I0130 18:43:54.968399 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wt4wf" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.296952 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-57gzf" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.362528 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jd5j\" (UniqueName: \"kubernetes.io/projected/7baed58d-56ba-4d44-a000-f14c036edecf-kube-api-access-8jd5j\") pod \"7baed58d-56ba-4d44-a000-f14c036edecf\" (UID: \"7baed58d-56ba-4d44-a000-f14c036edecf\") " Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.378424 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7baed58d-56ba-4d44-a000-f14c036edecf-kube-api-access-8jd5j" (OuterVolumeSpecName: "kube-api-access-8jd5j") pod "7baed58d-56ba-4d44-a000-f14c036edecf" (UID: "7baed58d-56ba-4d44-a000-f14c036edecf"). InnerVolumeSpecName "kube-api-access-8jd5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.464437 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jd5j\" (UniqueName: \"kubernetes.io/projected/7baed58d-56ba-4d44-a000-f14c036edecf-kube-api-access-8jd5j\") on node \"crc\" DevicePath \"\"" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.698889 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-nlj8p" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.855169 4782 generic.go:334] "Generic (PLEG): container finished" podID="7baed58d-56ba-4d44-a000-f14c036edecf" containerID="221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8" exitCode=0 Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.855277 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-57gzf" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.855282 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57gzf" event={"ID":"7baed58d-56ba-4d44-a000-f14c036edecf","Type":"ContainerDied","Data":"221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8"} Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.855366 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-57gzf" event={"ID":"7baed58d-56ba-4d44-a000-f14c036edecf","Type":"ContainerDied","Data":"ccecfc0bdfbb383e8e5e8288b64e7f2612354327166660e91b8fc1dfa4750019"} Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.855398 4782 scope.go:117] "RemoveContainer" containerID="221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.887980 4782 scope.go:117] "RemoveContainer" containerID="221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8" Jan 30 18:43:55 crc kubenswrapper[4782]: E0130 18:43:55.891405 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8\": container with ID starting with 221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8 not found: ID does not exist" containerID="221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.891461 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8"} err="failed 
to get container status \"221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8\": rpc error: code = NotFound desc = could not find container \"221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8\": container with ID starting with 221e81fc3dee8467d28a3954af78b07564f812428d1b5bd8fce81d46ec9fcac8 not found: ID does not exist" Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.905846 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-57gzf"] Jan 30 18:43:55 crc kubenswrapper[4782]: I0130 18:43:55.912341 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-57gzf"] Jan 30 18:43:56 crc kubenswrapper[4782]: I0130 18:43:56.424890 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7baed58d-56ba-4d44-a000-f14c036edecf" path="/var/lib/kubelet/pods/7baed58d-56ba-4d44-a000-f14c036edecf/volumes" Jan 30 18:44:03 crc kubenswrapper[4782]: I0130 18:44:03.810491 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:44:03 crc kubenswrapper[4782]: I0130 18:44:03.811214 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:44:03 crc kubenswrapper[4782]: I0130 18:44:03.841662 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:44:03 crc kubenswrapper[4782]: I0130 18:44:03.935883 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-v7mq6" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.793918 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc"] Jan 30 18:44:10 crc kubenswrapper[4782]: E0130 18:44:10.794793 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7baed58d-56ba-4d44-a000-f14c036edecf" containerName="registry-server" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.794810 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7baed58d-56ba-4d44-a000-f14c036edecf" containerName="registry-server" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.794986 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7baed58d-56ba-4d44-a000-f14c036edecf" containerName="registry-server" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.795957 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.797917 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5tstj" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.808462 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc"] Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.887376 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-util\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.887488 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-bundle\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.887559 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ss95\" (UniqueName: \"kubernetes.io/projected/cda5a397-ca19-4c00-97b7-f92b445ddecb-kube-api-access-7ss95\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.989081 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-bundle\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.989180 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ss95\" (UniqueName: \"kubernetes.io/projected/cda5a397-ca19-4c00-97b7-f92b445ddecb-kube-api-access-7ss95\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.989378 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-util\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.989841 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-bundle\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:10 crc kubenswrapper[4782]: I0130 18:44:10.989935 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-util\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:11 crc kubenswrapper[4782]: I0130 18:44:11.019915 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ss95\" (UniqueName: \"kubernetes.io/projected/cda5a397-ca19-4c00-97b7-f92b445ddecb-kube-api-access-7ss95\") pod \"e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:11 crc kubenswrapper[4782]: I0130 18:44:11.153474 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:11 crc kubenswrapper[4782]: I0130 18:44:11.611489 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc"] Jan 30 18:44:11 crc kubenswrapper[4782]: I0130 18:44:11.968800 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" event={"ID":"cda5a397-ca19-4c00-97b7-f92b445ddecb","Type":"ContainerStarted","Data":"b2d4a08801ec9f17c284fdb85992c3a0878da1616a064b2114688b399d0ba761"} Jan 30 18:44:11 crc kubenswrapper[4782]: I0130 18:44:11.968867 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" event={"ID":"cda5a397-ca19-4c00-97b7-f92b445ddecb","Type":"ContainerStarted","Data":"6247015e0ddfb20c2b379d8f27dc3a36bf09186c521f55b18f78d4209c3e3d36"} Jan 30 18:44:12 crc kubenswrapper[4782]: I0130 18:44:12.980325 4782 generic.go:334] "Generic (PLEG): container finished" podID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerID="b2d4a08801ec9f17c284fdb85992c3a0878da1616a064b2114688b399d0ba761" exitCode=0 Jan 30 18:44:12 crc kubenswrapper[4782]: I0130 18:44:12.980864 4782 generic.go:334] "Generic (PLEG): container finished" podID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerID="fb149bcdc623c8fa22342862c01ccde21c035a6ce47738e179ff6c76c468ec98" exitCode=0 Jan 30 18:44:12 crc kubenswrapper[4782]: I0130 18:44:12.980397 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" event={"ID":"cda5a397-ca19-4c00-97b7-f92b445ddecb","Type":"ContainerDied","Data":"b2d4a08801ec9f17c284fdb85992c3a0878da1616a064b2114688b399d0ba761"} Jan 30 18:44:12 crc kubenswrapper[4782]: I0130 18:44:12.980932 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" 
event={"ID":"cda5a397-ca19-4c00-97b7-f92b445ddecb","Type":"ContainerDied","Data":"fb149bcdc623c8fa22342862c01ccde21c035a6ce47738e179ff6c76c468ec98"} Jan 30 18:44:13 crc kubenswrapper[4782]: I0130 18:44:13.988833 4782 generic.go:334] "Generic (PLEG): container finished" podID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerID="3509957a1cf99c5352ef26d8c1402924c3f05ad99b450795253dc5983dee671c" exitCode=0 Jan 30 18:44:13 crc kubenswrapper[4782]: I0130 18:44:13.988981 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" event={"ID":"cda5a397-ca19-4c00-97b7-f92b445ddecb","Type":"ContainerDied","Data":"3509957a1cf99c5352ef26d8c1402924c3f05ad99b450795253dc5983dee671c"} Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.254672 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.348463 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-bundle\") pod \"cda5a397-ca19-4c00-97b7-f92b445ddecb\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.348874 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-util\") pod \"cda5a397-ca19-4c00-97b7-f92b445ddecb\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.348935 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ss95\" (UniqueName: \"kubernetes.io/projected/cda5a397-ca19-4c00-97b7-f92b445ddecb-kube-api-access-7ss95\") pod \"cda5a397-ca19-4c00-97b7-f92b445ddecb\" (UID: \"cda5a397-ca19-4c00-97b7-f92b445ddecb\") " Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.349117 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-bundle" (OuterVolumeSpecName: "bundle") pod "cda5a397-ca19-4c00-97b7-f92b445ddecb" (UID: "cda5a397-ca19-4c00-97b7-f92b445ddecb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.349334 4782 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.353343 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda5a397-ca19-4c00-97b7-f92b445ddecb-kube-api-access-7ss95" (OuterVolumeSpecName: "kube-api-access-7ss95") pod "cda5a397-ca19-4c00-97b7-f92b445ddecb" (UID: "cda5a397-ca19-4c00-97b7-f92b445ddecb"). InnerVolumeSpecName "kube-api-access-7ss95". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.366411 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-util" (OuterVolumeSpecName: "util") pod "cda5a397-ca19-4c00-97b7-f92b445ddecb" (UID: "cda5a397-ca19-4c00-97b7-f92b445ddecb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.452074 4782 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda5a397-ca19-4c00-97b7-f92b445ddecb-util\") on node \"crc\" DevicePath \"\"" Jan 30 18:44:15 crc kubenswrapper[4782]: I0130 18:44:15.452146 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ss95\" (UniqueName: \"kubernetes.io/projected/cda5a397-ca19-4c00-97b7-f92b445ddecb-kube-api-access-7ss95\") on node \"crc\" DevicePath \"\"" Jan 30 18:44:16 crc kubenswrapper[4782]: I0130 18:44:16.006409 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" event={"ID":"cda5a397-ca19-4c00-97b7-f92b445ddecb","Type":"ContainerDied","Data":"6247015e0ddfb20c2b379d8f27dc3a36bf09186c521f55b18f78d4209c3e3d36"} Jan 30 18:44:16 crc kubenswrapper[4782]: I0130 18:44:16.006503 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6247015e0ddfb20c2b379d8f27dc3a36bf09186c521f55b18f78d4209c3e3d36" Jan 30 18:44:16 crc kubenswrapper[4782]: I0130 18:44:16.006455 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc" Jan 30 18:44:19 crc kubenswrapper[4782]: I0130 18:44:19.793259 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:44:19 crc kubenswrapper[4782]: I0130 18:44:19.794817 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:44:19 crc kubenswrapper[4782]: I0130 18:44:19.794924 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:44:19 crc kubenswrapper[4782]: I0130 18:44:19.795617 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78bfa63564c6e38c41b1d267ccc1a5244efd72734f22ff3b1aac2f4b890ae15f"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:44:19 crc kubenswrapper[4782]: I0130 18:44:19.795752 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://78bfa63564c6e38c41b1d267ccc1a5244efd72734f22ff3b1aac2f4b890ae15f" gracePeriod=600 Jan 30 18:44:21 crc kubenswrapper[4782]: I0130 18:44:21.038964 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="78bfa63564c6e38c41b1d267ccc1a5244efd72734f22ff3b1aac2f4b890ae15f" exitCode=0 Jan 30 18:44:21 crc kubenswrapper[4782]: I0130 18:44:21.039034 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"78bfa63564c6e38c41b1d267ccc1a5244efd72734f22ff3b1aac2f4b890ae15f"} Jan 30 18:44:21 crc kubenswrapper[4782]: I0130 18:44:21.039414 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"fab30dd15f3ee1b70d16c1b0ecbd40cb333806a11c1e80db2f89491cb55d627e"} Jan 30 18:44:21 crc kubenswrapper[4782]: I0130 18:44:21.039437 4782 scope.go:117] "RemoveContainer" containerID="2fa84040f4bd5b0e4284745a145cd040aa730789685e2959938e53c4ffb71cd3" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.435058 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4"] Jan 30 18:44:23 crc kubenswrapper[4782]: E0130 18:44:23.435766 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerName="util" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.435787 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerName="util" Jan 30 18:44:23 crc kubenswrapper[4782]: E0130 18:44:23.435809 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerName="pull" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.435821 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerName="pull" Jan 30 18:44:23 crc kubenswrapper[4782]: E0130 18:44:23.435849 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerName="extract" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.435863 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerName="extract" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.436085 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda5a397-ca19-4c00-97b7-f92b445ddecb" containerName="extract" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.436770 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.439576 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-zx5ww" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.461530 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4"] Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.556830 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kk8\" (UniqueName: \"kubernetes.io/projected/48f3d327-7068-48e5-bd16-e8983d7dce53-kube-api-access-j6kk8\") pod \"openstack-operator-controller-init-678fbb89d4-gxzc4\" (UID: \"48f3d327-7068-48e5-bd16-e8983d7dce53\") " pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.658491 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kk8\" (UniqueName: \"kubernetes.io/projected/48f3d327-7068-48e5-bd16-e8983d7dce53-kube-api-access-j6kk8\") pod \"openstack-operator-controller-init-678fbb89d4-gxzc4\" (UID: \"48f3d327-7068-48e5-bd16-e8983d7dce53\") " pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.684148 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kk8\" (UniqueName: \"kubernetes.io/projected/48f3d327-7068-48e5-bd16-e8983d7dce53-kube-api-access-j6kk8\") pod \"openstack-operator-controller-init-678fbb89d4-gxzc4\" (UID: \"48f3d327-7068-48e5-bd16-e8983d7dce53\") " pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" Jan 30 18:44:23 crc kubenswrapper[4782]: I0130 18:44:23.753024 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" Jan 30 18:44:24 crc kubenswrapper[4782]: I0130 18:44:24.296904 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4"] Jan 30 18:44:24 crc kubenswrapper[4782]: W0130 18:44:24.311840 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f3d327_7068_48e5_bd16_e8983d7dce53.slice/crio-3ef843b6967e4fe039625c7c99fb7a1378e4f21a74e0a5db6f5a6412c9121fba WatchSource:0}: Error finding container 3ef843b6967e4fe039625c7c99fb7a1378e4f21a74e0a5db6f5a6412c9121fba: Status 404 returned error can't find the container with id 3ef843b6967e4fe039625c7c99fb7a1378e4f21a74e0a5db6f5a6412c9121fba Jan 30 18:44:25 crc kubenswrapper[4782]: I0130 18:44:25.078183 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" event={"ID":"48f3d327-7068-48e5-bd16-e8983d7dce53","Type":"ContainerStarted","Data":"3ef843b6967e4fe039625c7c99fb7a1378e4f21a74e0a5db6f5a6412c9121fba"} Jan 30 18:44:30 crc kubenswrapper[4782]: I0130 18:44:30.118836 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" event={"ID":"48f3d327-7068-48e5-bd16-e8983d7dce53","Type":"ContainerStarted","Data":"82d64f823407f6b95e0c08060c18d80399ad7e85456211b7bfaed7b7629e139e"} Jan 30 18:44:30 crc kubenswrapper[4782]: I0130 18:44:30.120323 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" Jan 30 18:44:30 crc kubenswrapper[4782]: I0130 18:44:30.149247 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" podStartSLOduration=2.486535048 podStartE2EDuration="7.149200842s" podCreationTimestamp="2026-01-30 18:44:23 +0000 UTC" firstStartedPulling="2026-01-30 18:44:24.314597389 +0000 UTC m=+840.582975434" lastFinishedPulling="2026-01-30 18:44:28.977263183 +0000 UTC m=+845.245641228" observedRunningTime="2026-01-30 18:44:30.143627414 +0000 UTC m=+846.412005439" watchObservedRunningTime="2026-01-30 18:44:30.149200842 +0000 UTC m=+846.417578867" Jan 30 18:44:43 crc kubenswrapper[4782]: I0130 18:44:43.756204 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-678fbb89d4-gxzc4" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.161803 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl"] Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.163278 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.165334 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.165647 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.173185 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl"] Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.287665 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4b16342-989f-4f2b-8eef-1e638aeb7858-secret-volume\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.287924 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4b16342-989f-4f2b-8eef-1e638aeb7858-config-volume\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.288113 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgtr5\" (UniqueName: \"kubernetes.io/projected/c4b16342-989f-4f2b-8eef-1e638aeb7858-kube-api-access-wgtr5\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.389379 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4b16342-989f-4f2b-8eef-1e638aeb7858-secret-volume\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.389783 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4b16342-989f-4f2b-8eef-1e638aeb7858-config-volume\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.389956 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgtr5\" (UniqueName: \"kubernetes.io/projected/c4b16342-989f-4f2b-8eef-1e638aeb7858-kube-api-access-wgtr5\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.391273 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4b16342-989f-4f2b-8eef-1e638aeb7858-config-volume\") pod 
\"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.400794 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4b16342-989f-4f2b-8eef-1e638aeb7858-secret-volume\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.413864 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgtr5\" (UniqueName: \"kubernetes.io/projected/c4b16342-989f-4f2b-8eef-1e638aeb7858-kube-api-access-wgtr5\") pod \"collect-profiles-29496645-474hl\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.484867 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:00 crc kubenswrapper[4782]: I0130 18:45:00.955896 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl"] Jan 30 18:45:01 crc kubenswrapper[4782]: I0130 18:45:01.358283 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" event={"ID":"c4b16342-989f-4f2b-8eef-1e638aeb7858","Type":"ContainerStarted","Data":"3de6827731405bb2f4c5bcef02cff0f66ee60f560a6480256e0cf7d894c02836"} Jan 30 18:45:01 crc kubenswrapper[4782]: I0130 18:45:01.358551 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" event={"ID":"c4b16342-989f-4f2b-8eef-1e638aeb7858","Type":"ContainerStarted","Data":"57ef45184bab0a769dd94ec7ab09d5686ea9ba3ee536dc88920fe3e5e3ec5d5c"} Jan 30 18:45:01 crc kubenswrapper[4782]: I0130 18:45:01.371142 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" podStartSLOduration=1.371123429 podStartE2EDuration="1.371123429s" podCreationTimestamp="2026-01-30 18:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:45:01.369979881 +0000 UTC m=+877.638357906" watchObservedRunningTime="2026-01-30 18:45:01.371123429 +0000 UTC m=+877.639501454" Jan 30 18:45:02 crc kubenswrapper[4782]: I0130 18:45:02.365435 4782 generic.go:334] "Generic (PLEG): container finished" podID="c4b16342-989f-4f2b-8eef-1e638aeb7858" containerID="3de6827731405bb2f4c5bcef02cff0f66ee60f560a6480256e0cf7d894c02836" exitCode=0 Jan 30 18:45:02 crc kubenswrapper[4782]: I0130 18:45:02.365503 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" event={"ID":"c4b16342-989f-4f2b-8eef-1e638aeb7858","Type":"ContainerDied","Data":"3de6827731405bb2f4c5bcef02cff0f66ee60f560a6480256e0cf7d894c02836"} Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.672475 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.838729 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgtr5\" (UniqueName: \"kubernetes.io/projected/c4b16342-989f-4f2b-8eef-1e638aeb7858-kube-api-access-wgtr5\") pod \"c4b16342-989f-4f2b-8eef-1e638aeb7858\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.839205 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4b16342-989f-4f2b-8eef-1e638aeb7858-config-volume\") pod \"c4b16342-989f-4f2b-8eef-1e638aeb7858\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.839281 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4b16342-989f-4f2b-8eef-1e638aeb7858-secret-volume\") pod \"c4b16342-989f-4f2b-8eef-1e638aeb7858\" (UID: \"c4b16342-989f-4f2b-8eef-1e638aeb7858\") " Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.839691 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4b16342-989f-4f2b-8eef-1e638aeb7858-config-volume" (OuterVolumeSpecName: "config-volume") pod "c4b16342-989f-4f2b-8eef-1e638aeb7858" (UID: "c4b16342-989f-4f2b-8eef-1e638aeb7858"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.847861 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b16342-989f-4f2b-8eef-1e638aeb7858-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c4b16342-989f-4f2b-8eef-1e638aeb7858" (UID: "c4b16342-989f-4f2b-8eef-1e638aeb7858"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.851391 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b16342-989f-4f2b-8eef-1e638aeb7858-kube-api-access-wgtr5" (OuterVolumeSpecName: "kube-api-access-wgtr5") pod "c4b16342-989f-4f2b-8eef-1e638aeb7858" (UID: "c4b16342-989f-4f2b-8eef-1e638aeb7858"). InnerVolumeSpecName "kube-api-access-wgtr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.940536 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgtr5\" (UniqueName: \"kubernetes.io/projected/c4b16342-989f-4f2b-8eef-1e638aeb7858-kube-api-access-wgtr5\") on node \"crc\" DevicePath \"\"" Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.940574 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4b16342-989f-4f2b-8eef-1e638aeb7858-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 18:45:03 crc kubenswrapper[4782]: I0130 18:45:03.940585 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4b16342-989f-4f2b-8eef-1e638aeb7858-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 18:45:04 crc kubenswrapper[4782]: I0130 18:45:04.382406 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" event={"ID":"c4b16342-989f-4f2b-8eef-1e638aeb7858","Type":"ContainerDied","Data":"57ef45184bab0a769dd94ec7ab09d5686ea9ba3ee536dc88920fe3e5e3ec5d5c"} Jan 30 18:45:04 crc kubenswrapper[4782]: I0130 18:45:04.382447 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ef45184bab0a769dd94ec7ab09d5686ea9ba3ee536dc88920fe3e5e3ec5d5c" Jan 30 18:45:04 crc kubenswrapper[4782]: I0130 18:45:04.382489 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.325817 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l"] Jan 30 18:45:12 crc kubenswrapper[4782]: E0130 18:45:12.326811 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b16342-989f-4f2b-8eef-1e638aeb7858" containerName="collect-profiles" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.326829 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b16342-989f-4f2b-8eef-1e638aeb7858" containerName="collect-profiles" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.327009 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b16342-989f-4f2b-8eef-1e638aeb7858" containerName="collect-profiles" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.327573 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.329080 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-n874p" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.336717 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.337476 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.344966 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-48zwd" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.352138 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.366023 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqp2j\" (UniqueName: \"kubernetes.io/projected/9517a543-a9e5-4253-a1b1-4154cf20a70a-kube-api-access-vqp2j\") pod \"cinder-operator-controller-manager-8d874c8fc-wmvtm\" (UID: \"9517a543-a9e5-4253-a1b1-4154cf20a70a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.366297 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4xlj\" (UniqueName: \"kubernetes.io/projected/c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828-kube-api-access-r4xlj\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-fmn9l\" (UID: \"c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.373315 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.374379 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.377040 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.381869 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6q5j4" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.390272 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.391615 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.395807 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rvt5c" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.399335 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.404115 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.423587 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.424302 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.429249 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.445958 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.447441 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6bgh9" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.449862 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mvf69" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.451129 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.478962 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.516287 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4v2h\" (UniqueName: \"kubernetes.io/projected/cd676b0f-9e48-461d-8381-998645228b54-kube-api-access-n4v2h\") pod \"horizon-operator-controller-manager-5fb775575f-745gl\" (UID: \"cd676b0f-9e48-461d-8381-998645228b54\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.520456 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqp2j\" (UniqueName: \"kubernetes.io/projected/9517a543-a9e5-4253-a1b1-4154cf20a70a-kube-api-access-vqp2j\") pod \"cinder-operator-controller-manager-8d874c8fc-wmvtm\" (UID: \"9517a543-a9e5-4253-a1b1-4154cf20a70a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.520651 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4xlj\" (UniqueName: \"kubernetes.io/projected/c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828-kube-api-access-r4xlj\") pod 
\"barbican-operator-controller-manager-7b6c4d8c5f-fmn9l\" (UID: \"c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.543088 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.543990 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.546418 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.547181 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4xlj\" (UniqueName: \"kubernetes.io/projected/c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828-kube-api-access-r4xlj\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-fmn9l\" (UID: \"c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.547798 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqp2j\" (UniqueName: \"kubernetes.io/projected/9517a543-a9e5-4253-a1b1-4154cf20a70a-kube-api-access-vqp2j\") pod \"cinder-operator-controller-manager-8d874c8fc-wmvtm\" (UID: \"9517a543-a9e5-4253-a1b1-4154cf20a70a\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.564740 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8t2vv" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.586324 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.590471 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.592397 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.598847 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.600030 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rdct6" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.606845 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.607787 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.608915 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-g4k4z" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.614674 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.615579 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.620138 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5hqkq" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.621145 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625140 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgr8d\" (UniqueName: \"kubernetes.io/projected/8b27955a-e2c6-43eb-953e-af3d66a687e3-kube-api-access-kgr8d\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625189 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsmdn\" (UniqueName: \"kubernetes.io/projected/2ca6290f-bb8e-484d-84bd-d9e66b9f1471-kube-api-access-rsmdn\") pod \"keystone-operator-controller-manager-84f48565d4-4jb22\" (UID: \"2ca6290f-bb8e-484d-84bd-d9e66b9f1471\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625291 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9nqf\" (UniqueName: \"kubernetes.io/projected/d82d84b6-3009-480d-b614-fbd420d90f0e-kube-api-access-d9nqf\") pod \"designate-operator-controller-manager-6d9697b7f4-txtbj\" (UID: \"d82d84b6-3009-480d-b614-fbd420d90f0e\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625320 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625343 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb64n\" (UniqueName: \"kubernetes.io/projected/1cb2fc09-3cbc-4cee-8a31-04a050d8ff04-kube-api-access-jb64n\") pod \"ironic-operator-controller-manager-5f4b8bd54d-h4kqr\" (UID: \"1cb2fc09-3cbc-4cee-8a31-04a050d8ff04\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625380 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4v2h\" (UniqueName: \"kubernetes.io/projected/cd676b0f-9e48-461d-8381-998645228b54-kube-api-access-n4v2h\") pod \"horizon-operator-controller-manager-5fb775575f-745gl\" (UID: \"cd676b0f-9e48-461d-8381-998645228b54\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625398 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcf4d\" (UniqueName: \"kubernetes.io/projected/f03fb99f-3277-4bff-bcd2-93756326af54-kube-api-access-fcf4d\") pod \"heat-operator-controller-manager-69d6db494d-gdx26\" (UID: \"f03fb99f-3277-4bff-bcd2-93756326af54\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625415 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtjc9\" (UniqueName: \"kubernetes.io/projected/8ec19937-0358-40cb-9fc0-de54ba844b62-kube-api-access-wtjc9\") pod \"manila-operator-controller-manager-7dd968899f-ffbz8\" (UID: \"8ec19937-0358-40cb-9fc0-de54ba844b62\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625437 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t68wg\" (UniqueName: \"kubernetes.io/projected/f55acdec-57ab-4e5d-97df-ac13e7b749da-kube-api-access-t68wg\") pod \"glance-operator-controller-manager-8886f4c47-v85zd\" (UID: \"f55acdec-57ab-4e5d-97df-ac13e7b749da\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.625760 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.632839 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.633601 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.634809 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-h8z5l" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.641059 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.641985 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.643524 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c7rjj" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.648657 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4v2h\" (UniqueName: \"kubernetes.io/projected/cd676b0f-9e48-461d-8381-998645228b54-kube-api-access-n4v2h\") pod \"horizon-operator-controller-manager-5fb775575f-745gl\" (UID: \"cd676b0f-9e48-461d-8381-998645228b54\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.648987 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.651839 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.657413 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.661510 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.669099 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-78nps"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.669993 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.671614 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5cwpp" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.674342 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.675286 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.678925 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hs8gw" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.682179 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-78nps"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.692333 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.698115 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.699792 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.701614 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-4vkhl" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.705845 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.706973 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.710288 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-wsmqq" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.716523 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.722128 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.725081 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-94zvw" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.725223 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731032 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb64n\" (UniqueName: \"kubernetes.io/projected/1cb2fc09-3cbc-4cee-8a31-04a050d8ff04-kube-api-access-jb64n\") pod \"ironic-operator-controller-manager-5f4b8bd54d-h4kqr\" (UID: \"1cb2fc09-3cbc-4cee-8a31-04a050d8ff04\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731106 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcf4d\" (UniqueName: \"kubernetes.io/projected/f03fb99f-3277-4bff-bcd2-93756326af54-kube-api-access-fcf4d\") pod \"heat-operator-controller-manager-69d6db494d-gdx26\" (UID: \"f03fb99f-3277-4bff-bcd2-93756326af54\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731125 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtjc9\" (UniqueName: \"kubernetes.io/projected/8ec19937-0358-40cb-9fc0-de54ba844b62-kube-api-access-wtjc9\") pod \"manila-operator-controller-manager-7dd968899f-ffbz8\" (UID: \"8ec19937-0358-40cb-9fc0-de54ba844b62\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731158 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t68wg\" (UniqueName: \"kubernetes.io/projected/f55acdec-57ab-4e5d-97df-ac13e7b749da-kube-api-access-t68wg\") pod \"glance-operator-controller-manager-8886f4c47-v85zd\" (UID: \"f55acdec-57ab-4e5d-97df-ac13e7b749da\") " 
pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731178 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgr8d\" (UniqueName: \"kubernetes.io/projected/8b27955a-e2c6-43eb-953e-af3d66a687e3-kube-api-access-kgr8d\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731224 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsmdn\" (UniqueName: \"kubernetes.io/projected/2ca6290f-bb8e-484d-84bd-d9e66b9f1471-kube-api-access-rsmdn\") pod \"keystone-operator-controller-manager-84f48565d4-4jb22\" (UID: \"2ca6290f-bb8e-484d-84bd-d9e66b9f1471\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731276 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntddd\" (UniqueName: \"kubernetes.io/projected/ccfec61d-1461-4d91-a834-3170c98cf92f-kube-api-access-ntddd\") pod \"mariadb-operator-controller-manager-67bf948998-94rpc\" (UID: \"ccfec61d-1461-4d91-a834-3170c98cf92f\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731327 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9nqf\" (UniqueName: \"kubernetes.io/projected/d82d84b6-3009-480d-b614-fbd420d90f0e-kube-api-access-d9nqf\") pod \"designate-operator-controller-manager-6d9697b7f4-txtbj\" (UID: \"d82d84b6-3009-480d-b614-fbd420d90f0e\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.731359 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:12 crc kubenswrapper[4782]: E0130 18:45:12.731466 4782 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:12 crc kubenswrapper[4782]: E0130 18:45:12.731510 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert podName:8b27955a-e2c6-43eb-953e-af3d66a687e3 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:13.231494945 +0000 UTC m=+889.499872970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert") pod "infra-operator-controller-manager-79955696d6-fg8bm" (UID: "8b27955a-e2c6-43eb-953e-af3d66a687e3") : secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.742985 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.755797 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb64n\" (UniqueName: \"kubernetes.io/projected/1cb2fc09-3cbc-4cee-8a31-04a050d8ff04-kube-api-access-jb64n\") pod \"ironic-operator-controller-manager-5f4b8bd54d-h4kqr\" (UID: \"1cb2fc09-3cbc-4cee-8a31-04a050d8ff04\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.757457 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgr8d\" (UniqueName: \"kubernetes.io/projected/8b27955a-e2c6-43eb-953e-af3d66a687e3-kube-api-access-kgr8d\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.757627 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9nqf\" (UniqueName: \"kubernetes.io/projected/d82d84b6-3009-480d-b614-fbd420d90f0e-kube-api-access-d9nqf\") pod \"designate-operator-controller-manager-6d9697b7f4-txtbj\" (UID: \"d82d84b6-3009-480d-b614-fbd420d90f0e\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.758537 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.759902 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t68wg\" (UniqueName: \"kubernetes.io/projected/f55acdec-57ab-4e5d-97df-ac13e7b749da-kube-api-access-t68wg\") pod \"glance-operator-controller-manager-8886f4c47-v85zd\" (UID: \"f55acdec-57ab-4e5d-97df-ac13e7b749da\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.761179 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsmdn\" (UniqueName: \"kubernetes.io/projected/2ca6290f-bb8e-484d-84bd-d9e66b9f1471-kube-api-access-rsmdn\") pod \"keystone-operator-controller-manager-84f48565d4-4jb22\" (UID: \"2ca6290f-bb8e-484d-84bd-d9e66b9f1471\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.764386 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtjc9\" (UniqueName: \"kubernetes.io/projected/8ec19937-0358-40cb-9fc0-de54ba844b62-kube-api-access-wtjc9\") pod \"manila-operator-controller-manager-7dd968899f-ffbz8\" (UID: \"8ec19937-0358-40cb-9fc0-de54ba844b62\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.767244 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fcf4d\" (UniqueName: \"kubernetes.io/projected/f03fb99f-3277-4bff-bcd2-93756326af54-kube-api-access-fcf4d\") pod \"heat-operator-controller-manager-69d6db494d-gdx26\" (UID: \"f03fb99f-3277-4bff-bcd2-93756326af54\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.769927 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.775016 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.775878 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.778768 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5bgh4" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.783547 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.783789 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.827457 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.835833 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.835882 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nlcw\" (UniqueName: \"kubernetes.io/projected/097a5bf9-6be5-4d4e-9547-f1318371e9db-kube-api-access-8nlcw\") pod \"ovn-operator-controller-manager-788c46999f-lqrfz\" (UID: \"097a5bf9-6be5-4d4e-9547-f1318371e9db\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.835967 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8lz\" (UniqueName: \"kubernetes.io/projected/c212e215-248f-4b93-9a70-b352f425648c-kube-api-access-cv8lz\") pod \"placement-operator-controller-manager-5b964cf4cd-vsmt2\" (UID: \"c212e215-248f-4b93-9a70-b352f425648c\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.835993 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j69b\" (UniqueName: \"kubernetes.io/projected/acd35126-a27d-4b4c-b56b-04ebd8358c74-kube-api-access-2j69b\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: 
\"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.836087 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh6sv\" (UniqueName: \"kubernetes.io/projected/79d06938-56c3-4ec4-a455-0fde260d8cdd-kube-api-access-lh6sv\") pod \"neutron-operator-controller-manager-585dbc889-45lw5\" (UID: \"79d06938-56c3-4ec4-a455-0fde260d8cdd\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.836118 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntddd\" (UniqueName: \"kubernetes.io/projected/ccfec61d-1461-4d91-a834-3170c98cf92f-kube-api-access-ntddd\") pod \"mariadb-operator-controller-manager-67bf948998-94rpc\" (UID: \"ccfec61d-1461-4d91-a834-3170c98cf92f\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.836277 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84c2w\" (UniqueName: \"kubernetes.io/projected/0301eb58-f901-4952-9f7e-7764c0e67d7f-kube-api-access-84c2w\") pod \"octavia-operator-controller-manager-6687f8d877-8phk5\" (UID: \"0301eb58-f901-4952-9f7e-7764c0e67d7f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.836308 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phknf\" (UniqueName: \"kubernetes.io/projected/5a54baf5-b3a2-4417-8caf-8fe321ff5f5f-kube-api-access-phknf\") pod \"nova-operator-controller-manager-55bff696bd-78nps\" (UID: \"5a54baf5-b3a2-4417-8caf-8fe321ff5f5f\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.843644 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.849893 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.851290 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.854847 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-x22js" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.871124 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntddd\" (UniqueName: \"kubernetes.io/projected/ccfec61d-1461-4d91-a834-3170c98cf92f-kube-api-access-ntddd\") pod \"mariadb-operator-controller-manager-67bf948998-94rpc\" (UID: \"ccfec61d-1461-4d91-a834-3170c98cf92f\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.932205 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.933105 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.933756 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.936483 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nwrr7" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.937353 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941495 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8lz\" (UniqueName: \"kubernetes.io/projected/c212e215-248f-4b93-9a70-b352f425648c-kube-api-access-cv8lz\") pod \"placement-operator-controller-manager-5b964cf4cd-vsmt2\" (UID: \"c212e215-248f-4b93-9a70-b352f425648c\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941535 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j69b\" (UniqueName: \"kubernetes.io/projected/acd35126-a27d-4b4c-b56b-04ebd8358c74-kube-api-access-2j69b\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941562 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfkzm\" (UniqueName: \"kubernetes.io/projected/52ac09b6-ec41-4ebc-ac18-018794fab085-kube-api-access-dfkzm\") pod \"swift-operator-controller-manager-68fc8c869-rmkdk\" (UID: \"52ac09b6-ec41-4ebc-ac18-018794fab085\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941590 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lh6sv\" (UniqueName: \"kubernetes.io/projected/79d06938-56c3-4ec4-a455-0fde260d8cdd-kube-api-access-lh6sv\") pod \"neutron-operator-controller-manager-585dbc889-45lw5\" (UID: \"79d06938-56c3-4ec4-a455-0fde260d8cdd\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941632 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84c2w\" (UniqueName: \"kubernetes.io/projected/0301eb58-f901-4952-9f7e-7764c0e67d7f-kube-api-access-84c2w\") pod \"octavia-operator-controller-manager-6687f8d877-8phk5\" (UID: \"0301eb58-f901-4952-9f7e-7764c0e67d7f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941657 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phknf\" (UniqueName: \"kubernetes.io/projected/5a54baf5-b3a2-4417-8caf-8fe321ff5f5f-kube-api-access-phknf\") pod \"nova-operator-controller-manager-55bff696bd-78nps\" (UID: \"5a54baf5-b3a2-4417-8caf-8fe321ff5f5f\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941718 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.941737 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nlcw\" (UniqueName: \"kubernetes.io/projected/097a5bf9-6be5-4d4e-9547-f1318371e9db-kube-api-access-8nlcw\") pod \"ovn-operator-controller-manager-788c46999f-lqrfz\" (UID: \"097a5bf9-6be5-4d4e-9547-f1318371e9db\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" Jan 30 18:45:12 crc kubenswrapper[4782]: E0130 18:45:12.948054 4782 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:12 crc kubenswrapper[4782]: E0130 18:45:12.948121 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert podName:acd35126-a27d-4b4c-b56b-04ebd8358c74 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:13.448102361 +0000 UTC m=+889.716480386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" (UID: "acd35126-a27d-4b4c-b56b-04ebd8358c74") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.950771 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.960890 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.964067 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.969763 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-bnczv" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.970148 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84c2w\" (UniqueName: \"kubernetes.io/projected/0301eb58-f901-4952-9f7e-7764c0e67d7f-kube-api-access-84c2w\") pod \"octavia-operator-controller-manager-6687f8d877-8phk5\" (UID: \"0301eb58-f901-4952-9f7e-7764c0e67d7f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.971845 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nlcw\" (UniqueName: \"kubernetes.io/projected/097a5bf9-6be5-4d4e-9547-f1318371e9db-kube-api-access-8nlcw\") pod \"ovn-operator-controller-manager-788c46999f-lqrfz\" (UID: \"097a5bf9-6be5-4d4e-9547-f1318371e9db\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.972424 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh6sv\" (UniqueName: \"kubernetes.io/projected/79d06938-56c3-4ec4-a455-0fde260d8cdd-kube-api-access-lh6sv\") pod \"neutron-operator-controller-manager-585dbc889-45lw5\" (UID: \"79d06938-56c3-4ec4-a455-0fde260d8cdd\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.973833 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8lz\" (UniqueName: \"kubernetes.io/projected/c212e215-248f-4b93-9a70-b352f425648c-kube-api-access-cv8lz\") pod \"placement-operator-controller-manager-5b964cf4cd-vsmt2\" (UID: \"c212e215-248f-4b93-9a70-b352f425648c\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.974821 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz"] Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.975013 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j69b\" (UniqueName: \"kubernetes.io/projected/acd35126-a27d-4b4c-b56b-04ebd8358c74-kube-api-access-2j69b\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:12 crc kubenswrapper[4782]: I0130 18:45:12.985146 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phknf\" (UniqueName: \"kubernetes.io/projected/5a54baf5-b3a2-4417-8caf-8fe321ff5f5f-kube-api-access-phknf\") pod \"nova-operator-controller-manager-55bff696bd-78nps\" (UID: 
\"5a54baf5-b3a2-4417-8caf-8fe321ff5f5f\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.010734 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.029169 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.035313 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.042999 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrn2h\" (UniqueName: \"kubernetes.io/projected/1794e6a9-01aa-43b7-841d-ca7bc24950f8-kube-api-access-xrn2h\") pod \"test-operator-controller-manager-56f8bfcd9f-brw4k\" (UID: \"1794e6a9-01aa-43b7-841d-ca7bc24950f8\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.043044 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfkzm\" (UniqueName: \"kubernetes.io/projected/52ac09b6-ec41-4ebc-ac18-018794fab085-kube-api-access-dfkzm\") pod \"swift-operator-controller-manager-68fc8c869-rmkdk\" (UID: \"52ac09b6-ec41-4ebc-ac18-018794fab085\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.043077 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6scg\" (UniqueName: \"kubernetes.io/projected/f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3-kube-api-access-b6scg\") pod \"telemetry-operator-controller-manager-64b5b76f97-khwrr\" (UID: \"f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.051567 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.056808 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.057631 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.066513 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.066561 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.066518 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fwsgq" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.071836 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.072094 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfkzm\" (UniqueName: \"kubernetes.io/projected/52ac09b6-ec41-4ebc-ac18-018794fab085-kube-api-access-dfkzm\") pod \"swift-operator-controller-manager-68fc8c869-rmkdk\" (UID: \"52ac09b6-ec41-4ebc-ac18-018794fab085\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.081279 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.098078 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.108003 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.130516 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.131690 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.134997 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-74ggr" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.148040 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.149520 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrn2h\" (UniqueName: \"kubernetes.io/projected/1794e6a9-01aa-43b7-841d-ca7bc24950f8-kube-api-access-xrn2h\") pod \"test-operator-controller-manager-56f8bfcd9f-brw4k\" (UID: \"1794e6a9-01aa-43b7-841d-ca7bc24950f8\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.149550 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln96k\" (UniqueName: \"kubernetes.io/projected/a765979e-db86-4d07-8a0a-96c61d42137c-kube-api-access-ln96k\") pod \"watcher-operator-controller-manager-78c8444fdd-928lz\" (UID: \"a765979e-db86-4d07-8a0a-96c61d42137c\") " pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.149604 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6scg\" (UniqueName: \"kubernetes.io/projected/f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3-kube-api-access-b6scg\") pod \"telemetry-operator-controller-manager-64b5b76f97-khwrr\" (UID: \"f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.155388 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.167864 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrn2h\" (UniqueName: \"kubernetes.io/projected/1794e6a9-01aa-43b7-841d-ca7bc24950f8-kube-api-access-xrn2h\") pod \"test-operator-controller-manager-56f8bfcd9f-brw4k\" (UID: \"1794e6a9-01aa-43b7-841d-ca7bc24950f8\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.173264 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6scg\" (UniqueName: \"kubernetes.io/projected/f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3-kube-api-access-b6scg\") pod \"telemetry-operator-controller-manager-64b5b76f97-khwrr\" (UID: \"f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.173596 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.191443 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.201011 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.210424 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.210634 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.212136 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.251040 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvx4z\" (UniqueName: \"kubernetes.io/projected/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-kube-api-access-lvx4z\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.251131 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.251185 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgp7\" (UniqueName: \"kubernetes.io/projected/fcdbdda2-62ba-4df8-9885-78c31d1e6157-kube-api-access-2hgp7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8jcf8\" (UID: \"fcdbdda2-62ba-4df8-9885-78c31d1e6157\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.251207 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.251272 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln96k\" (UniqueName: \"kubernetes.io/projected/a765979e-db86-4d07-8a0a-96c61d42137c-kube-api-access-ln96k\") pod \"watcher-operator-controller-manager-78c8444fdd-928lz\" (UID: \"a765979e-db86-4d07-8a0a-96c61d42137c\") " pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.251304 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.251498 4782 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.251555 4782 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert podName:8b27955a-e2c6-43eb-953e-af3d66a687e3 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:14.251524207 +0000 UTC m=+890.519902232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert") pod "infra-operator-controller-manager-79955696d6-fg8bm" (UID: "8b27955a-e2c6-43eb-953e-af3d66a687e3") : secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.268410 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.271178 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.273883 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln96k\" (UniqueName: \"kubernetes.io/projected/a765979e-db86-4d07-8a0a-96c61d42137c-kube-api-access-ln96k\") pod \"watcher-operator-controller-manager-78c8444fdd-928lz\" (UID: \"a765979e-db86-4d07-8a0a-96c61d42137c\") " pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.285539 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.352563 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgp7\" (UniqueName: \"kubernetes.io/projected/fcdbdda2-62ba-4df8-9885-78c31d1e6157-kube-api-access-2hgp7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8jcf8\" (UID: \"fcdbdda2-62ba-4df8-9885-78c31d1e6157\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.352600 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.352644 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.352690 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvx4z\" (UniqueName: \"kubernetes.io/projected/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-kube-api-access-lvx4z\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.352774 4782 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.352833 4782 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.352848 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:13.852832977 +0000 UTC m=+890.121211002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "metrics-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.352944 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:13.852918489 +0000 UTC m=+890.121296514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.379339 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvx4z\" (UniqueName: \"kubernetes.io/projected/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-kube-api-access-lvx4z\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.387045 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgp7\" (UniqueName: \"kubernetes.io/projected/fcdbdda2-62ba-4df8-9885-78c31d1e6157-kube-api-access-2hgp7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8jcf8\" (UID: \"fcdbdda2-62ba-4df8-9885-78c31d1e6157\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.464535 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.465385 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.465559 4782 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.465601 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert podName:acd35126-a27d-4b4c-b56b-04ebd8358c74 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:14.46558844 +0000 UTC m=+890.733966465 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" (UID: "acd35126-a27d-4b4c-b56b-04ebd8358c74") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.510209 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.536713 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.595401 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" event={"ID":"f03fb99f-3277-4bff-bcd2-93756326af54","Type":"ContainerStarted","Data":"b625bb075c5e8cd81d15a4a0c677a564911040353dd501744eb65e282141c197"} Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.624600 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" event={"ID":"c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828","Type":"ContainerStarted","Data":"a65c4d675885a286d396265898bab4473141e4fe9d9638530fac1ec5e3984530"} Jan 30 18:45:13 crc kubenswrapper[4782]: W0130 18:45:13.628340 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd676b0f_9e48_461d_8381_998645228b54.slice/crio-856867c6a3a88704ed160edfa9797bf49912c76f1ed13be91e7e6ba409c3546c WatchSource:0}: Error finding container 856867c6a3a88704ed160edfa9797bf49912c76f1ed13be91e7e6ba409c3546c: Status 404 returned error can't find the container with id 856867c6a3a88704ed160edfa9797bf49912c76f1ed13be91e7e6ba409c3546c Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.641706 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" event={"ID":"9517a543-a9e5-4253-a1b1-4154cf20a70a","Type":"ContainerStarted","Data":"9322ae57ee22333b926ff002c40667ab0e104073c0b930644f19ebc0d9e2d517"} Jan 30 18:45:13 crc kubenswrapper[4782]: W0130 18:45:13.643524 4782 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd82d84b6_3009_480d_b614_fbd420d90f0e.slice/crio-a00ea2992397d4f9617e0b2bd3e26bf0a23dac11a962ff177a82eb86037ecc6b WatchSource:0}: Error finding container a00ea2992397d4f9617e0b2bd3e26bf0a23dac11a962ff177a82eb86037ecc6b: Status 404 returned error can't find the container with id a00ea2992397d4f9617e0b2bd3e26bf0a23dac11a962ff177a82eb86037ecc6b Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.645599 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr"] Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.651370 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22"] Jan 30 18:45:13 crc kubenswrapper[4782]: W0130 18:45:13.687478 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ca6290f_bb8e_484d_84bd_d9e66b9f1471.slice/crio-da408d8ca3775074c5f486c4139a20e836ae9988a7c8c384e4e6bcb7f61ee740 WatchSource:0}: Error finding container da408d8ca3775074c5f486c4139a20e836ae9988a7c8c384e4e6bcb7f61ee740: Status 404 returned error can't find the container with id da408d8ca3775074c5f486c4139a20e836ae9988a7c8c384e4e6bcb7f61ee740 Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.853994 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc"] Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.872156 4782 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.872327 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:14.872300215 +0000 UTC m=+891.140678240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "metrics-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.871939 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.872865 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.873073 4782 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: E0130 18:45:13.873116 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:14.873106785 +0000 UTC m=+891.141484810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "webhook-server-cert" not found Jan 30 18:45:13 crc kubenswrapper[4782]: I0130 18:45:13.876912 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd"] Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.222377 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0301eb58_f901_4952_9f7e_7764c0e67d7f.slice/crio-dc8a9cde16a533e3fed3700dfe2594a88d73e9f1d5e574f0b6172cfea2b3d59a WatchSource:0}: Error finding container dc8a9cde16a533e3fed3700dfe2594a88d73e9f1d5e574f0b6172cfea2b3d59a: Status 404 returned error can't find the container with id dc8a9cde16a533e3fed3700dfe2594a88d73e9f1d5e574f0b6172cfea2b3d59a Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.225620 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5"] Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.237625 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8"] Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.255103 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec19937_0358_40cb_9fc0_de54ba844b62.slice/crio-f79bb83bfd826a4ac05447ea339ec369840157ef425702546b240d68757d1f5e WatchSource:0}: Error finding container 
f79bb83bfd826a4ac05447ea339ec369840157ef425702546b240d68757d1f5e: Status 404 returned error can't find the container with id f79bb83bfd826a4ac05447ea339ec369840157ef425702546b240d68757d1f5e Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.285582 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.286031 4782 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.286078 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert podName:8b27955a-e2c6-43eb-953e-af3d66a687e3 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:16.286064294 +0000 UTC m=+892.554442309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert") pod "infra-operator-controller-manager-79955696d6-fg8bm" (UID: "8b27955a-e2c6-43eb-953e-af3d66a687e3") : secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.298463 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-78nps"] Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.325306 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5"] Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.327499 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d06938_56c3_4ec4_a455_0fde260d8cdd.slice/crio-330ef9909f5af702cf04cb5a63b61c6860e098eee5143563726227beef537d6f WatchSource:0}: Error finding container 330ef9909f5af702cf04cb5a63b61c6860e098eee5143563726227beef537d6f: Status 404 returned error can't find the container with id 330ef9909f5af702cf04cb5a63b61c6860e098eee5143563726227beef537d6f Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.329794 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1794e6a9_01aa_43b7_841d_ca7bc24950f8.slice/crio-7e096600ef28965a9ec0af92c19f82d1eee2dd45ab1d4e227007caf2384ac771 WatchSource:0}: Error finding container 7e096600ef28965a9ec0af92c19f82d1eee2dd45ab1d4e227007caf2384ac771: Status 404 returned error can't find the container with id 7e096600ef28965a9ec0af92c19f82d1eee2dd45ab1d4e227007caf2384ac771 Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.344523 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a54baf5_b3a2_4417_8caf_8fe321ff5f5f.slice/crio-ba2441dc5ae2e4c01dfed50dc5a52648de6e802e96f3cb093f6892537fa7570c WatchSource:0}: Error finding container ba2441dc5ae2e4c01dfed50dc5a52648de6e802e96f3cb093f6892537fa7570c: Status 404 returned error can't find the container with id ba2441dc5ae2e4c01dfed50dc5a52648de6e802e96f3cb093f6892537fa7570c Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.345876 4782 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz"] Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.354549 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k"] Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.438622 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk"] Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.445518 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz"] Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.450808 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0dd10e9_3f32_401a_a1d2_ba3e2ac503b3.slice/crio-63b568d377bac05b993f59d75a4c7bb28e84f9834eeefe51ae0af74bb14e7330 WatchSource:0}: Error finding container 63b568d377bac05b993f59d75a4c7bb28e84f9834eeefe51ae0af74bb14e7330: Status 404 returned error can't find the container with id 63b568d377bac05b993f59d75a4c7bb28e84f9834eeefe51ae0af74bb14e7330 Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.452668 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcdbdda2_62ba_4df8_9885_78c31d1e6157.slice/crio-847f69251dab844bfcabed8debe7a2df4c31425f27ec6d79a0f819b473473894 WatchSource:0}: Error finding container 847f69251dab844bfcabed8debe7a2df4c31425f27ec6d79a0f819b473473894: Status 404 returned error can't find the container with id 847f69251dab844bfcabed8debe7a2df4c31425f27ec6d79a0f819b473473894 Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.456056 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b6scg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-khwrr_openstack-operators(f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.457311 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" podUID="f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3" Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.458920 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc212e215_248f_4b93_9a70_b352f425648c.slice/crio-a6d18429cca17882caa80002f69e15d212c486667aef320485aab6e938d2d1b0 WatchSource:0}: Error finding container a6d18429cca17882caa80002f69e15d212c486667aef320485aab6e938d2d1b0: Status 404 returned error can't find the container with id a6d18429cca17882caa80002f69e15d212c486667aef320485aab6e938d2d1b0 Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.459361 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8"] Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.460266 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda765979e_db86_4d07_8a0a_96c61d42137c.slice/crio-eb1d7827908dc32d5df509672ba455cf745655d62e53493790acf8b3396620f1 WatchSource:0}: Error finding container eb1d7827908dc32d5df509672ba455cf745655d62e53493790acf8b3396620f1: Status 404 returned error can't find the container with id eb1d7827908dc32d5df509672ba455cf745655d62e53493790acf8b3396620f1 Jan 30 18:45:14 crc kubenswrapper[4782]: W0130 18:45:14.462167 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ac09b6_ec41_4ebc_ac18_018794fab085.slice/crio-837343de60442785147db04866da3394887aec297920c7d4bc1082c3163b040e WatchSource:0}: Error finding container 837343de60442785147db04866da3394887aec297920c7d4bc1082c3163b040e: Status 404 returned error can't find the container with id 837343de60442785147db04866da3394887aec297920c7d4bc1082c3163b040e Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.462554 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cv8lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-vsmt2_openstack-operators(c212e215-248f-4b93-9a70-b352f425648c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.463462 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.5:5001/openstack-k8s-operators/watcher-operator:add353f857c04debbf620f926c6c19f4f45c7f75,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ln96k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-78c8444fdd-928lz_openstack-operators(a765979e-db86-4d07-8a0a-96c61d42137c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.463684 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" podUID="c212e215-248f-4b93-9a70-b352f425648c" Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.463943 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr"] Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.464704 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" podUID="a765979e-db86-4d07-8a0a-96c61d42137c" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.464768 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dfkzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-rmkdk_openstack-operators(52ac09b6-ec41-4ebc-ac18-018794fab085): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.476609 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" podUID="52ac09b6-ec41-4ebc-ac18-018794fab085" Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.480842 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2"] Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.488779 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.488967 4782 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.489010 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert podName:acd35126-a27d-4b4c-b56b-04ebd8358c74 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:16.48899503 +0000 UTC m=+892.757373045 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" (UID: "acd35126-a27d-4b4c-b56b-04ebd8358c74") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.653079 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" event={"ID":"fcdbdda2-62ba-4df8-9885-78c31d1e6157","Type":"ContainerStarted","Data":"847f69251dab844bfcabed8debe7a2df4c31425f27ec6d79a0f819b473473894"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.658426 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" event={"ID":"f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3","Type":"ContainerStarted","Data":"63b568d377bac05b993f59d75a4c7bb28e84f9834eeefe51ae0af74bb14e7330"} Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.663026 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" podUID="f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3" Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.665478 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" event={"ID":"2ca6290f-bb8e-484d-84bd-d9e66b9f1471","Type":"ContainerStarted","Data":"da408d8ca3775074c5f486c4139a20e836ae9988a7c8c384e4e6bcb7f61ee740"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.669165 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" event={"ID":"1794e6a9-01aa-43b7-841d-ca7bc24950f8","Type":"ContainerStarted","Data":"7e096600ef28965a9ec0af92c19f82d1eee2dd45ab1d4e227007caf2384ac771"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.671522 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" event={"ID":"0301eb58-f901-4952-9f7e-7764c0e67d7f","Type":"ContainerStarted","Data":"dc8a9cde16a533e3fed3700dfe2594a88d73e9f1d5e574f0b6172cfea2b3d59a"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.673126 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" event={"ID":"f55acdec-57ab-4e5d-97df-ac13e7b749da","Type":"ContainerStarted","Data":"d35da12da700786eb195bffd7d88a46f44972663f7bc067f40ebdc15d1a6b531"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.674501 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" event={"ID":"8ec19937-0358-40cb-9fc0-de54ba844b62","Type":"ContainerStarted","Data":"f79bb83bfd826a4ac05447ea339ec369840157ef425702546b240d68757d1f5e"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.675833 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" 
event={"ID":"1cb2fc09-3cbc-4cee-8a31-04a050d8ff04","Type":"ContainerStarted","Data":"9b9884fa6d27069c44f82b36abec1980783abab20346a22202e6fc57190022a3"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.676922 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" event={"ID":"ccfec61d-1461-4d91-a834-3170c98cf92f","Type":"ContainerStarted","Data":"f13573544c3774c3af59271b82a13ff96a8b65ae35413d59aabff95cb404ba9b"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.678249 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" event={"ID":"cd676b0f-9e48-461d-8381-998645228b54","Type":"ContainerStarted","Data":"856867c6a3a88704ed160edfa9797bf49912c76f1ed13be91e7e6ba409c3546c"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.679963 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" event={"ID":"5a54baf5-b3a2-4417-8caf-8fe321ff5f5f","Type":"ContainerStarted","Data":"ba2441dc5ae2e4c01dfed50dc5a52648de6e802e96f3cb093f6892537fa7570c"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.681013 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" event={"ID":"a765979e-db86-4d07-8a0a-96c61d42137c","Type":"ContainerStarted","Data":"eb1d7827908dc32d5df509672ba455cf745655d62e53493790acf8b3396620f1"} Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.683749 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/watcher-operator:add353f857c04debbf620f926c6c19f4f45c7f75\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" podUID="a765979e-db86-4d07-8a0a-96c61d42137c" Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.685486 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" event={"ID":"d82d84b6-3009-480d-b614-fbd420d90f0e","Type":"ContainerStarted","Data":"a00ea2992397d4f9617e0b2bd3e26bf0a23dac11a962ff177a82eb86037ecc6b"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.696563 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" event={"ID":"097a5bf9-6be5-4d4e-9547-f1318371e9db","Type":"ContainerStarted","Data":"e46bee59e7538264ec6ca4ba2d2e5016f10e20ced7a1fd826a4922087bc66fe5"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.698803 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" event={"ID":"c212e215-248f-4b93-9a70-b352f425648c","Type":"ContainerStarted","Data":"a6d18429cca17882caa80002f69e15d212c486667aef320485aab6e938d2d1b0"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.704583 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" event={"ID":"52ac09b6-ec41-4ebc-ac18-018794fab085","Type":"ContainerStarted","Data":"837343de60442785147db04866da3394887aec297920c7d4bc1082c3163b040e"} Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.709094 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" podUID="c212e215-248f-4b93-9a70-b352f425648c" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.719178 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" podUID="52ac09b6-ec41-4ebc-ac18-018794fab085" Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.781694 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" event={"ID":"79d06938-56c3-4ec4-a455-0fde260d8cdd","Type":"ContainerStarted","Data":"330ef9909f5af702cf04cb5a63b61c6860e098eee5143563726227beef537d6f"} Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.899955 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:14 crc kubenswrapper[4782]: I0130 18:45:14.900046 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.900206 4782 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.900288 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:16.900268528 +0000 UTC m=+893.168646553 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "metrics-server-cert" not found Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.900212 4782 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 18:45:14 crc kubenswrapper[4782]: E0130 18:45:14.900391 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:16.900367591 +0000 UTC m=+893.168745706 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "webhook-server-cert" not found Jan 30 18:45:15 crc kubenswrapper[4782]: E0130 18:45:15.838320 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" podUID="f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3" Jan 30 18:45:15 crc kubenswrapper[4782]: E0130 18:45:15.838408 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" podUID="c212e215-248f-4b93-9a70-b352f425648c" Jan 30 18:45:15 crc kubenswrapper[4782]: E0130 18:45:15.838507 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" podUID="52ac09b6-ec41-4ebc-ac18-018794fab085" Jan 30 18:45:15 crc kubenswrapper[4782]: E0130 18:45:15.838550 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/watcher-operator:add353f857c04debbf620f926c6c19f4f45c7f75\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" podUID="a765979e-db86-4d07-8a0a-96c61d42137c" Jan 30 18:45:16 crc kubenswrapper[4782]: I0130 18:45:16.327330 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.327623 4782 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.327683 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert podName:8b27955a-e2c6-43eb-953e-af3d66a687e3 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:20.327666536 +0000 UTC m=+896.596044561 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert") pod "infra-operator-controller-manager-79955696d6-fg8bm" (UID: "8b27955a-e2c6-43eb-953e-af3d66a687e3") : secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:16 crc kubenswrapper[4782]: I0130 18:45:16.531097 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.531282 4782 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.531326 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert podName:acd35126-a27d-4b4c-b56b-04ebd8358c74 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:20.53131295 +0000 UTC m=+896.799690975 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" (UID: "acd35126-a27d-4b4c-b56b-04ebd8358c74") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:16 crc kubenswrapper[4782]: I0130 18:45:16.937905 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:16 crc kubenswrapper[4782]: I0130 18:45:16.938205 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.938125 4782 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.938341 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:20.938318582 +0000 UTC m=+897.206696677 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "metrics-server-cert" not found Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.938520 4782 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 18:45:16 crc kubenswrapper[4782]: E0130 18:45:16.938612 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:20.938592559 +0000 UTC m=+897.206970574 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "webhook-server-cert" not found Jan 30 18:45:20 crc kubenswrapper[4782]: I0130 18:45:20.401483 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:20 crc kubenswrapper[4782]: E0130 18:45:20.401760 4782 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:20 crc kubenswrapper[4782]: E0130 18:45:20.401921 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert podName:8b27955a-e2c6-43eb-953e-af3d66a687e3 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:28.401901998 +0000 UTC m=+904.670280013 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert") pod "infra-operator-controller-manager-79955696d6-fg8bm" (UID: "8b27955a-e2c6-43eb-953e-af3d66a687e3") : secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:20 crc kubenswrapper[4782]: I0130 18:45:20.606789 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:20 crc kubenswrapper[4782]: E0130 18:45:20.607080 4782 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:20 crc kubenswrapper[4782]: E0130 18:45:20.607219 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert podName:acd35126-a27d-4b4c-b56b-04ebd8358c74 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:28.607188414 +0000 UTC m=+904.875566439 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" (UID: "acd35126-a27d-4b4c-b56b-04ebd8358c74") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:21 crc kubenswrapper[4782]: I0130 18:45:21.012552 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:21 crc kubenswrapper[4782]: E0130 18:45:21.012755 4782 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 18:45:21 crc kubenswrapper[4782]: E0130 18:45:21.012837 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:29.012816931 +0000 UTC m=+905.281194966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "webhook-server-cert" not found Jan 30 18:45:21 crc kubenswrapper[4782]: I0130 18:45:21.013406 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:21 crc kubenswrapper[4782]: E0130 18:45:21.014125 4782 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 18:45:21 crc kubenswrapper[4782]: E0130 18:45:21.014265 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:29.014215906 +0000 UTC m=+905.282594021 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "metrics-server-cert" not found Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.639462 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-45s4h"] Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.648653 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.652309 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-45s4h"] Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.693046 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxsvx\" (UniqueName: \"kubernetes.io/projected/969b9c93-6d7d-4be0-befc-f3b1677b6a96-kube-api-access-qxsvx\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.693090 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-utilities\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.693119 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-catalog-content\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.794447 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxsvx\" (UniqueName: \"kubernetes.io/projected/969b9c93-6d7d-4be0-befc-f3b1677b6a96-kube-api-access-qxsvx\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.794511 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-utilities\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.794558 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-catalog-content\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.795302 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-catalog-content\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.795330 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-utilities\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.834771 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qxsvx\" (UniqueName: \"kubernetes.io/projected/969b9c93-6d7d-4be0-befc-f3b1677b6a96-kube-api-access-qxsvx\") pod \"certified-operators-45s4h\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:24 crc kubenswrapper[4782]: I0130 18:45:24.990086 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:27 crc kubenswrapper[4782]: E0130 18:45:27.865391 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566" Jan 30 18:45:27 crc kubenswrapper[4782]: E0130 18:45:27.866088 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtjc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-ffbz8_openstack-operators(8ec19937-0358-40cb-9fc0-de54ba844b62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:45:27 crc kubenswrapper[4782]: E0130 18:45:27.867453 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" podUID="8ec19937-0358-40cb-9fc0-de54ba844b62" Jan 30 18:45:27 crc kubenswrapper[4782]: E0130 18:45:27.915830 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" podUID="8ec19937-0358-40cb-9fc0-de54ba844b62" Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.439126 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be" Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.439396 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84c2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-8phk5_openstack-operators(0301eb58-f901-4952-9f7e-7764c0e67d7f): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.440567 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" podUID="0301eb58-f901-4952-9f7e-7764c0e67d7f" Jan 30 18:45:28 crc kubenswrapper[4782]: I0130 18:45:28.448603 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.449251 4782 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.449328 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert podName:8b27955a-e2c6-43eb-953e-af3d66a687e3 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:44.4493087 +0000 UTC m=+920.717686725 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert") pod "infra-operator-controller-manager-79955696d6-fg8bm" (UID: "8b27955a-e2c6-43eb-953e-af3d66a687e3") : secret "infra-operator-webhook-server-cert" not found Jan 30 18:45:28 crc kubenswrapper[4782]: I0130 18:45:28.653670 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.653860 4782 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.653919 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert podName:acd35126-a27d-4b4c-b56b-04ebd8358c74 nodeName:}" failed. No retries permitted until 2026-01-30 18:45:44.653900278 +0000 UTC m=+920.922278323 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" (UID: "acd35126-a27d-4b4c-b56b-04ebd8358c74") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 18:45:28 crc kubenswrapper[4782]: E0130 18:45:28.924129 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" podUID="0301eb58-f901-4952-9f7e-7764c0e67d7f" Jan 30 18:45:29 crc kubenswrapper[4782]: I0130 18:45:29.057670 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:29 crc kubenswrapper[4782]: I0130 18:45:29.057731 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:29 crc kubenswrapper[4782]: E0130 18:45:29.057842 4782 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 18:45:29 crc kubenswrapper[4782]: E0130 18:45:29.057885 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:45.057871785 +0000 UTC m=+921.326249800 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "webhook-server-cert" not found Jan 30 18:45:29 crc kubenswrapper[4782]: E0130 18:45:29.058492 4782 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 18:45:29 crc kubenswrapper[4782]: E0130 18:45:29.058521 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs podName:ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c nodeName:}" failed. No retries permitted until 2026-01-30 18:45:45.058512211 +0000 UTC m=+921.326890236 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs") pod "openstack-operator-controller-manager-857dcb78d6-4vgqm" (UID: "ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c") : secret "metrics-server-cert" not found Jan 30 18:45:30 crc kubenswrapper[4782]: E0130 18:45:30.004756 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Jan 30 18:45:30 crc kubenswrapper[4782]: E0130 18:45:30.004939 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rsmdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-4jb22_openstack-operators(2ca6290f-bb8e-484d-84bd-d9e66b9f1471): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:45:30 crc kubenswrapper[4782]: E0130 18:45:30.006125 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" 
podUID="2ca6290f-bb8e-484d-84bd-d9e66b9f1471" Jan 30 18:45:30 crc kubenswrapper[4782]: E0130 18:45:30.933434 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" podUID="2ca6290f-bb8e-484d-84bd-d9e66b9f1471" Jan 30 18:45:36 crc kubenswrapper[4782]: E0130 18:45:36.587919 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 30 18:45:36 crc kubenswrapper[4782]: E0130 18:45:36.588571 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2hgp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-8jcf8_openstack-operators(fcdbdda2-62ba-4df8-9885-78c31d1e6157): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:45:36 crc kubenswrapper[4782]: E0130 18:45:36.592926 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" podUID="fcdbdda2-62ba-4df8-9885-78c31d1e6157" Jan 30 18:45:38 crc kubenswrapper[4782]: E0130 18:45:38.128751 4782 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" podUID="fcdbdda2-62ba-4df8-9885-78c31d1e6157" Jan 30 18:45:39 crc kubenswrapper[4782]: E0130 18:45:39.620927 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Jan 30 18:45:39 crc kubenswrapper[4782]: E0130 18:45:39.621254 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-phknf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-78nps_openstack-operators(5a54baf5-b3a2-4417-8caf-8fe321ff5f5f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:45:39 crc kubenswrapper[4782]: E0130 18:45:39.623710 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" podUID="5a54baf5-b3a2-4417-8caf-8fe321ff5f5f" Jan 30 18:45:39 crc kubenswrapper[4782]: E0130 18:45:39.720833 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/watcher-operator:add353f857c04debbf620f926c6c19f4f45c7f75" Jan 30 18:45:39 crc kubenswrapper[4782]: E0130 18:45:39.720897 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/watcher-operator:add353f857c04debbf620f926c6c19f4f45c7f75" Jan 30 18:45:39 crc kubenswrapper[4782]: E0130 18:45:39.721076 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.5:5001/openstack-k8s-operators/watcher-operator:add353f857c04debbf620f926c6c19f4f45c7f75,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ln96k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-78c8444fdd-928lz_openstack-operators(a765979e-db86-4d07-8a0a-96c61d42137c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:45:39 crc kubenswrapper[4782]: E0130 18:45:39.722338 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" podUID="a765979e-db86-4d07-8a0a-96c61d42137c" Jan 30 18:45:40 crc kubenswrapper[4782]: E0130 18:45:40.004785 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" podUID="5a54baf5-b3a2-4417-8caf-8fe321ff5f5f" Jan 30 18:45:40 crc kubenswrapper[4782]: I0130 18:45:40.368665 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-45s4h"] Jan 30 18:45:40 crc kubenswrapper[4782]: W0130 18:45:40.420788 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969b9c93_6d7d_4be0_befc_f3b1677b6a96.slice/crio-2b13c3c0a18cd87e0b80fc8592a3b40af172495726eb4b38a0cc0b9245b2b2e7 WatchSource:0}: Error finding container 2b13c3c0a18cd87e0b80fc8592a3b40af172495726eb4b38a0cc0b9245b2b2e7: Status 404 returned error can't find the container with id 2b13c3c0a18cd87e0b80fc8592a3b40af172495726eb4b38a0cc0b9245b2b2e7 Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.011800 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" event={"ID":"f03fb99f-3277-4bff-bcd2-93756326af54","Type":"ContainerStarted","Data":"27a1e4f7133ecf6319345392dae1ac8986ea357a2da8ca47d4c6e3b67159929c"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.012333 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.020263 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" event={"ID":"f55acdec-57ab-4e5d-97df-ac13e7b749da","Type":"ContainerStarted","Data":"2d728d000ebb0523fe5193fd74d36155c3366b79c705fa9ca2a83e650b32cdb1"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.021202 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.022833 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" event={"ID":"d82d84b6-3009-480d-b614-fbd420d90f0e","Type":"ContainerStarted","Data":"8f208a86f1d09acd5b25a139e0d1dfeee7e5965f01fb9bc2e849fb75b3361826"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.023268 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.024697 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" event={"ID":"1cb2fc09-3cbc-4cee-8a31-04a050d8ff04","Type":"ContainerStarted","Data":"9f4df3a1eb068390e0ac2621d1107780d3ca922d87db0d0f64700bb72e0979bc"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.025100 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" Jan 30 18:45:41 
crc kubenswrapper[4782]: I0130 18:45:41.034367 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" event={"ID":"ccfec61d-1461-4d91-a834-3170c98cf92f","Type":"ContainerStarted","Data":"2d9a32a2b501a3f8aa4f57e3a57e4bcd685a95c754220512bebc52649caf7e2c"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.034447 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.035874 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" podStartSLOduration=11.185151882 podStartE2EDuration="29.0358549s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.210433549 +0000 UTC m=+889.478811564" lastFinishedPulling="2026-01-30 18:45:31.061136557 +0000 UTC m=+907.329514582" observedRunningTime="2026-01-30 18:45:41.033925642 +0000 UTC m=+917.302303657" watchObservedRunningTime="2026-01-30 18:45:41.0358549 +0000 UTC m=+917.304232925" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.038120 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" event={"ID":"c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828","Type":"ContainerStarted","Data":"380918b393f59324d7e72bb36f5de016d917ef3923713a4770ce85792775a19a"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.038993 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.041069 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" event={"ID":"79d06938-56c3-4ec4-a455-0fde260d8cdd","Type":"ContainerStarted","Data":"2067da015b52dfe95f1c452405c5862da3a6bcfc73591d4e2795156a1cb83347"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.041669 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.043464 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" event={"ID":"9517a543-a9e5-4253-a1b1-4154cf20a70a","Type":"ContainerStarted","Data":"9f35ae7b1fff0d4584df3ec1b5b2a9ff7906ffd7dc2b402cfb908a2a93d7ba9a"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.043637 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.050666 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45s4h" event={"ID":"969b9c93-6d7d-4be0-befc-f3b1677b6a96","Type":"ContainerStarted","Data":"2b13c3c0a18cd87e0b80fc8592a3b40af172495726eb4b38a0cc0b9245b2b2e7"} Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.058781 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" podStartSLOduration=12.196164276 podStartE2EDuration="29.058759467s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" 
firstStartedPulling="2026-01-30 18:45:13.667751868 +0000 UTC m=+889.936129893" lastFinishedPulling="2026-01-30 18:45:30.530347059 +0000 UTC m=+906.798725084" observedRunningTime="2026-01-30 18:45:41.048068542 +0000 UTC m=+917.316446567" watchObservedRunningTime="2026-01-30 18:45:41.058759467 +0000 UTC m=+917.327137492" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.064939 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" podStartSLOduration=11.716006782000001 podStartE2EDuration="29.06492569s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.712439455 +0000 UTC m=+889.980817480" lastFinishedPulling="2026-01-30 18:45:31.061358363 +0000 UTC m=+907.329736388" observedRunningTime="2026-01-30 18:45:41.060135361 +0000 UTC m=+917.328513386" watchObservedRunningTime="2026-01-30 18:45:41.06492569 +0000 UTC m=+917.333303715" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.087864 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" podStartSLOduration=12.485333448 podStartE2EDuration="29.087834577s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.927630695 +0000 UTC m=+890.196008720" lastFinishedPulling="2026-01-30 18:45:30.530131824 +0000 UTC m=+906.798509849" observedRunningTime="2026-01-30 18:45:41.076269421 +0000 UTC m=+917.344647446" watchObservedRunningTime="2026-01-30 18:45:41.087834577 +0000 UTC m=+917.356212602" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.106430 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" podStartSLOduration=12.48980194 podStartE2EDuration="29.106412718s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.912118971 +0000 UTC m=+890.180496996" lastFinishedPulling="2026-01-30 18:45:30.528729749 +0000 UTC m=+906.797107774" observedRunningTime="2026-01-30 18:45:41.101591518 +0000 UTC m=+917.369969543" watchObservedRunningTime="2026-01-30 18:45:41.106412718 +0000 UTC m=+917.374790743" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.136386 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" podStartSLOduration=12.405230595 podStartE2EDuration="29.13637121s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.330037923 +0000 UTC m=+890.598415948" lastFinishedPulling="2026-01-30 18:45:31.061178528 +0000 UTC m=+907.329556563" observedRunningTime="2026-01-30 18:45:41.132513524 +0000 UTC m=+917.400891549" watchObservedRunningTime="2026-01-30 18:45:41.13637121 +0000 UTC m=+917.404749235" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.162541 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" podStartSLOduration=11.453368396 podStartE2EDuration="29.162524938s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.351995386 +0000 UTC m=+889.620373411" lastFinishedPulling="2026-01-30 18:45:31.061151928 +0000 UTC m=+907.329529953" observedRunningTime="2026-01-30 18:45:41.157799441 +0000 UTC m=+917.426177466" 
watchObservedRunningTime="2026-01-30 18:45:41.162524938 +0000 UTC m=+917.430902963" Jan 30 18:45:41 crc kubenswrapper[4782]: I0130 18:45:41.210890 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" podStartSLOduration=12.033149157 podStartE2EDuration="29.210872025s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.352360485 +0000 UTC m=+889.620738510" lastFinishedPulling="2026-01-30 18:45:30.530083353 +0000 UTC m=+906.798461378" observedRunningTime="2026-01-30 18:45:41.206166729 +0000 UTC m=+917.474544754" watchObservedRunningTime="2026-01-30 18:45:41.210872025 +0000 UTC m=+917.479250050" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.068584 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" event={"ID":"097a5bf9-6be5-4d4e-9547-f1318371e9db","Type":"ContainerStarted","Data":"e22c4aabddd0d369013bd05505c0917d82499adeba80818000b21ff9cc4886c2"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.069099 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.069903 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" event={"ID":"f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3","Type":"ContainerStarted","Data":"7f9eabc4c6654bee2c2151a2cdb018292e27d978b25f8bb7a3c0feb865175176"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.070151 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.071334 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" event={"ID":"cd676b0f-9e48-461d-8381-998645228b54","Type":"ContainerStarted","Data":"05e1babc09383ac93589381ab687a0bb67b5dd92e00229009bd7eb843344448e"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.071534 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.072903 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" event={"ID":"52ac09b6-ec41-4ebc-ac18-018794fab085","Type":"ContainerStarted","Data":"f81707a293b2630ae0e77eb2b0a7781cc40c9ea39dc170c0eeb498560f8c1fb3"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.073179 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.074634 4782 generic.go:334] "Generic (PLEG): container finished" podID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerID="fdb9afe4297da9889c80e5bac342005e3790e9d6482e867fc767dfbc7394ce3e" exitCode=0 Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.074695 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45s4h" event={"ID":"969b9c93-6d7d-4be0-befc-f3b1677b6a96","Type":"ContainerDied","Data":"fdb9afe4297da9889c80e5bac342005e3790e9d6482e867fc767dfbc7394ce3e"} Jan 
30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.075981 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" event={"ID":"0301eb58-f901-4952-9f7e-7764c0e67d7f","Type":"ContainerStarted","Data":"00e1da9005f412c208e0663c96f1cde75b52495089c91b5943714642ddc8508e"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.076418 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.078091 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" event={"ID":"8ec19937-0358-40cb-9fc0-de54ba844b62","Type":"ContainerStarted","Data":"8b6f00ab1d9d8f39863d5a427381fb627e512a36df5791a42f0d5f071c2fe57c"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.078767 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.079976 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" event={"ID":"c212e215-248f-4b93-9a70-b352f425648c","Type":"ContainerStarted","Data":"6291c62d841175c936689abedfd21f09ad11ad1b509092513fa0fd563ed68488"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.080189 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.081353 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" event={"ID":"1794e6a9-01aa-43b7-841d-ca7bc24950f8","Type":"ContainerStarted","Data":"fd20468ffdb310094c4aec2c65dd50986b117d260cb3c1eb8fa71b93dafd0967"} Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.094141 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" podStartSLOduration=5.720199821 podStartE2EDuration="31.094122386s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.325929531 +0000 UTC m=+890.594307556" lastFinishedPulling="2026-01-30 18:45:39.699852076 +0000 UTC m=+915.968230121" observedRunningTime="2026-01-30 18:45:43.092243929 +0000 UTC m=+919.360621964" watchObservedRunningTime="2026-01-30 18:45:43.094122386 +0000 UTC m=+919.362500411" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.131682 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" podStartSLOduration=5.107832872 podStartE2EDuration="31.131664296s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.676006472 +0000 UTC m=+889.944384497" lastFinishedPulling="2026-01-30 18:45:39.699837896 +0000 UTC m=+915.968215921" observedRunningTime="2026-01-30 18:45:43.13144208 +0000 UTC m=+919.399820115" watchObservedRunningTime="2026-01-30 18:45:43.131664296 +0000 UTC m=+919.400042321" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.154787 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" 
podStartSLOduration=3.597524011 podStartE2EDuration="31.154757368s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.462443713 +0000 UTC m=+890.730821738" lastFinishedPulling="2026-01-30 18:45:42.01967707 +0000 UTC m=+918.288055095" observedRunningTime="2026-01-30 18:45:43.150107633 +0000 UTC m=+919.418485658" watchObservedRunningTime="2026-01-30 18:45:43.154757368 +0000 UTC m=+919.423135393" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.169269 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" podStartSLOduration=5.425870009 podStartE2EDuration="31.169242297s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.464654857 +0000 UTC m=+890.733032882" lastFinishedPulling="2026-01-30 18:45:40.208027145 +0000 UTC m=+916.476405170" observedRunningTime="2026-01-30 18:45:43.162081089 +0000 UTC m=+919.430459114" watchObservedRunningTime="2026-01-30 18:45:43.169242297 +0000 UTC m=+919.437620332" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.194793 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" podStartSLOduration=3.188324284 podStartE2EDuration="31.194768869s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.224827657 +0000 UTC m=+890.493205682" lastFinishedPulling="2026-01-30 18:45:42.231272242 +0000 UTC m=+918.499650267" observedRunningTime="2026-01-30 18:45:43.19400895 +0000 UTC m=+919.462386975" watchObservedRunningTime="2026-01-30 18:45:43.194768869 +0000 UTC m=+919.463146894" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.216062 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" podStartSLOduration=5.95390644 podStartE2EDuration="31.216039586s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.331779986 +0000 UTC m=+890.600158011" lastFinishedPulling="2026-01-30 18:45:39.593913132 +0000 UTC m=+915.862291157" observedRunningTime="2026-01-30 18:45:43.211833452 +0000 UTC m=+919.480211477" watchObservedRunningTime="2026-01-30 18:45:43.216039586 +0000 UTC m=+919.484417611" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.233259 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" podStartSLOduration=3.471679463 podStartE2EDuration="31.233240052s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.258085891 +0000 UTC m=+890.526463936" lastFinishedPulling="2026-01-30 18:45:42.01964651 +0000 UTC m=+918.288024525" observedRunningTime="2026-01-30 18:45:43.230078244 +0000 UTC m=+919.498456269" watchObservedRunningTime="2026-01-30 18:45:43.233240052 +0000 UTC m=+919.501618077" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.266219 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" podStartSLOduration=3.491328649 podStartE2EDuration="31.266201558s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.455903241 +0000 UTC m=+890.724281266" lastFinishedPulling="2026-01-30 18:45:42.23077615 +0000 UTC m=+918.499154175" 
observedRunningTime="2026-01-30 18:45:43.263171163 +0000 UTC m=+919.531549188" watchObservedRunningTime="2026-01-30 18:45:43.266201558 +0000 UTC m=+919.534579583" Jan 30 18:45:43 crc kubenswrapper[4782]: I0130 18:45:43.272274 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.092030 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45s4h" event={"ID":"969b9c93-6d7d-4be0-befc-f3b1677b6a96","Type":"ContainerStarted","Data":"1d7613b867fe0696c40d3881b4a80e851f8135c43ff28784458d8fccb3d28771"} Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.530000 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.538682 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b27955a-e2c6-43eb-953e-af3d66a687e3-cert\") pod \"infra-operator-controller-manager-79955696d6-fg8bm\" (UID: \"8b27955a-e2c6-43eb-953e-af3d66a687e3\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.724629 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8t2vv" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.731982 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.733600 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.738409 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/acd35126-a27d-4b4c-b56b-04ebd8358c74-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d524dn\" (UID: \"acd35126-a27d-4b4c-b56b-04ebd8358c74\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.983661 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-94zvw" Jan 30 18:45:44 crc kubenswrapper[4782]: I0130 18:45:44.992201 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.032508 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm"] Jan 30 18:45:45 crc kubenswrapper[4782]: W0130 18:45:45.039606 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b27955a_e2c6_43eb_953e_af3d66a687e3.slice/crio-61bfc04d30c2693e237f58673b3fd7e18435e3979e152bd718134ad95cbd95c3 WatchSource:0}: Error finding container 61bfc04d30c2693e237f58673b3fd7e18435e3979e152bd718134ad95cbd95c3: Status 404 returned error can't find the container with id 61bfc04d30c2693e237f58673b3fd7e18435e3979e152bd718134ad95cbd95c3 Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.100967 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" event={"ID":"8b27955a-e2c6-43eb-953e-af3d66a687e3","Type":"ContainerStarted","Data":"61bfc04d30c2693e237f58673b3fd7e18435e3979e152bd718134ad95cbd95c3"} Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.102971 4782 generic.go:334] "Generic (PLEG): container finished" podID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerID="1d7613b867fe0696c40d3881b4a80e851f8135c43ff28784458d8fccb3d28771" exitCode=0 Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.103045 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45s4h" event={"ID":"969b9c93-6d7d-4be0-befc-f3b1677b6a96","Type":"ContainerDied","Data":"1d7613b867fe0696c40d3881b4a80e851f8135c43ff28784458d8fccb3d28771"} Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.138349 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.138507 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.155941 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-metrics-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.155949 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c-webhook-certs\") pod \"openstack-operator-controller-manager-857dcb78d6-4vgqm\" (UID: \"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c\") " pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 
18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.198810 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fwsgq" Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.208630 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.470149 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn"] Jan 30 18:45:45 crc kubenswrapper[4782]: W0130 18:45:45.486514 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacd35126_a27d_4b4c_b56b_04ebd8358c74.slice/crio-97e981e6f1bf04cc30ab3078b8e8589ff378724c575aa8c9efe448c3918dd06e WatchSource:0}: Error finding container 97e981e6f1bf04cc30ab3078b8e8589ff378724c575aa8c9efe448c3918dd06e: Status 404 returned error can't find the container with id 97e981e6f1bf04cc30ab3078b8e8589ff378724c575aa8c9efe448c3918dd06e Jan 30 18:45:45 crc kubenswrapper[4782]: I0130 18:45:45.628401 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm"] Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.115305 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" event={"ID":"acd35126-a27d-4b4c-b56b-04ebd8358c74","Type":"ContainerStarted","Data":"97e981e6f1bf04cc30ab3078b8e8589ff378724c575aa8c9efe448c3918dd06e"} Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.118605 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" event={"ID":"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c","Type":"ContainerStarted","Data":"4090929c3db50e55b55f51eb32eab2e7ab325b732d9345ca771b7dfdfb7bf392"} Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.118665 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" event={"ID":"ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c","Type":"ContainerStarted","Data":"d6264f0caecbda0471813205ebf2c08fcc9b6438879e5bc0c11128ee0a0eafe6"} Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.118687 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.123419 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45s4h" event={"ID":"969b9c93-6d7d-4be0-befc-f3b1677b6a96","Type":"ContainerStarted","Data":"64a0bc57a0a74312f2c8052a4b460567176807f42160bc82de64152320a572b9"} Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.126733 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" event={"ID":"2ca6290f-bb8e-484d-84bd-d9e66b9f1471","Type":"ContainerStarted","Data":"ab43d540ccdaa4b53852e1235452b9953acd1e331b52dbface0865be56f07784"} Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.126935 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" Jan 30 18:45:46 crc 
kubenswrapper[4782]: I0130 18:45:46.150352 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" podStartSLOduration=33.150331071 podStartE2EDuration="33.150331071s" podCreationTimestamp="2026-01-30 18:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:45:46.144834055 +0000 UTC m=+922.413212090" watchObservedRunningTime="2026-01-30 18:45:46.150331071 +0000 UTC m=+922.418709106" Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.165423 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-45s4h" podStartSLOduration=19.451069958 podStartE2EDuration="22.165400494s" podCreationTimestamp="2026-01-30 18:45:24 +0000 UTC" firstStartedPulling="2026-01-30 18:45:43.076294764 +0000 UTC m=+919.344672789" lastFinishedPulling="2026-01-30 18:45:45.79062529 +0000 UTC m=+922.059003325" observedRunningTime="2026-01-30 18:45:46.163024795 +0000 UTC m=+922.431402810" watchObservedRunningTime="2026-01-30 18:45:46.165400494 +0000 UTC m=+922.433778519" Jan 30 18:45:46 crc kubenswrapper[4782]: I0130 18:45:46.182297 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" podStartSLOduration=2.951255501 podStartE2EDuration="34.182280032s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:13.723438677 +0000 UTC m=+889.991816702" lastFinishedPulling="2026-01-30 18:45:44.954463198 +0000 UTC m=+921.222841233" observedRunningTime="2026-01-30 18:45:46.180287373 +0000 UTC m=+922.448665398" watchObservedRunningTime="2026-01-30 18:45:46.182280032 +0000 UTC m=+922.450658057" Jan 30 18:45:49 crc kubenswrapper[4782]: I0130 18:45:49.155906 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" event={"ID":"8b27955a-e2c6-43eb-953e-af3d66a687e3","Type":"ContainerStarted","Data":"50cd655c4a32e2fba9fc83a0312f4163b54caf62aedfcbd2129bb9d2638633e1"} Jan 30 18:45:49 crc kubenswrapper[4782]: I0130 18:45:49.157330 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" event={"ID":"acd35126-a27d-4b4c-b56b-04ebd8358c74","Type":"ContainerStarted","Data":"de69f36cb45883767fd662f8412b5020d57e070bb47c5980b1e7f06c24a6b4b2"} Jan 30 18:45:49 crc kubenswrapper[4782]: I0130 18:45:49.157521 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:49 crc kubenswrapper[4782]: I0130 18:45:49.192758 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" podStartSLOduration=33.937036258 podStartE2EDuration="37.192726394s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:45.042711744 +0000 UTC m=+921.311089769" lastFinishedPulling="2026-01-30 18:45:48.29840187 +0000 UTC m=+924.566779905" observedRunningTime="2026-01-30 18:45:49.182433049 +0000 UTC m=+925.450811094" watchObservedRunningTime="2026-01-30 18:45:49.192726394 +0000 UTC m=+925.461104489" Jan 30 18:45:49 crc kubenswrapper[4782]: I0130 18:45:49.231927 4782 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" podStartSLOduration=34.416127004 podStartE2EDuration="37.231905434s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:45.488900836 +0000 UTC m=+921.757278861" lastFinishedPulling="2026-01-30 18:45:48.304679266 +0000 UTC m=+924.573057291" observedRunningTime="2026-01-30 18:45:49.223396663 +0000 UTC m=+925.491774738" watchObservedRunningTime="2026-01-30 18:45:49.231905434 +0000 UTC m=+925.500283469" Jan 30 18:45:50 crc kubenswrapper[4782]: I0130 18:45:50.163836 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:52 crc kubenswrapper[4782]: I0130 18:45:52.656997 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-fmn9l" Jan 30 18:45:52 crc kubenswrapper[4782]: I0130 18:45:52.665187 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-wmvtm" Jan 30 18:45:52 crc kubenswrapper[4782]: I0130 18:45:52.793173 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-gdx26" Jan 30 18:45:52 crc kubenswrapper[4782]: I0130 18:45:52.849057 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-745gl" Jan 30 18:45:52 crc kubenswrapper[4782]: I0130 18:45:52.936867 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-h4kqr" Jan 30 18:45:52 crc kubenswrapper[4782]: I0130 18:45:52.964933 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-4jb22" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.016415 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-txtbj" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.035314 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-v85zd" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.049662 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ffbz8" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.054042 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-94rpc" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.083843 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-45lw5" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.117619 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-8phk5" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.152804 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lqrfz" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.176371 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-vsmt2" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.190686 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" event={"ID":"fcdbdda2-62ba-4df8-9885-78c31d1e6157","Type":"ContainerStarted","Data":"faeb2eb723be91bb8b4143681281f084ece413621b60f2cbf935ae7cdc80ca3e"} Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.206105 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-rmkdk" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.215508 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-khwrr" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.221623 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8jcf8" podStartSLOduration=1.7951575640000001 podStartE2EDuration="40.221604103s" podCreationTimestamp="2026-01-30 18:45:13 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.454900096 +0000 UTC m=+890.723278121" lastFinishedPulling="2026-01-30 18:45:52.881346635 +0000 UTC m=+929.149724660" observedRunningTime="2026-01-30 18:45:53.21257858 +0000 UTC m=+929.480956625" watchObservedRunningTime="2026-01-30 18:45:53.221604103 +0000 UTC m=+929.489982128" Jan 30 18:45:53 crc kubenswrapper[4782]: I0130 18:45:53.279630 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-brw4k" Jan 30 18:45:53 crc kubenswrapper[4782]: E0130 18:45:53.411630 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/watcher-operator:add353f857c04debbf620f926c6c19f4f45c7f75\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" podUID="a765979e-db86-4d07-8a0a-96c61d42137c" Jan 30 18:45:54 crc kubenswrapper[4782]: I0130 18:45:54.204976 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" event={"ID":"5a54baf5-b3a2-4417-8caf-8fe321ff5f5f","Type":"ContainerStarted","Data":"5f64e13bf64b50a0e9babea76b8ce17f07eef9ef8b4723912ae3813b49acea2f"} Jan 30 18:45:54 crc kubenswrapper[4782]: I0130 18:45:54.205301 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" Jan 30 18:45:54 crc kubenswrapper[4782]: I0130 18:45:54.228261 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" podStartSLOduration=2.962859769 podStartE2EDuration="42.228202348s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.347792503 +0000 UTC m=+890.616170528" lastFinishedPulling="2026-01-30 18:45:53.613135052 +0000 UTC m=+929.881513107" observedRunningTime="2026-01-30 18:45:54.227372267 +0000 UTC m=+930.495750332" 
watchObservedRunningTime="2026-01-30 18:45:54.228202348 +0000 UTC m=+930.496580413" Jan 30 18:45:54 crc kubenswrapper[4782]: I0130 18:45:54.742966 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-fg8bm" Jan 30 18:45:54 crc kubenswrapper[4782]: I0130 18:45:54.990960 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:54 crc kubenswrapper[4782]: I0130 18:45:54.991045 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:55 crc kubenswrapper[4782]: I0130 18:45:55.000926 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d524dn" Jan 30 18:45:55 crc kubenswrapper[4782]: I0130 18:45:55.073933 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:55 crc kubenswrapper[4782]: I0130 18:45:55.215113 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-857dcb78d6-4vgqm" Jan 30 18:45:55 crc kubenswrapper[4782]: I0130 18:45:55.267401 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:55 crc kubenswrapper[4782]: I0130 18:45:55.815862 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-45s4h"] Jan 30 18:45:57 crc kubenswrapper[4782]: I0130 18:45:57.231594 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-45s4h" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="registry-server" containerID="cri-o://64a0bc57a0a74312f2c8052a4b460567176807f42160bc82de64152320a572b9" gracePeriod=2 Jan 30 18:45:58 crc kubenswrapper[4782]: I0130 18:45:58.241552 4782 generic.go:334] "Generic (PLEG): container finished" podID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerID="64a0bc57a0a74312f2c8052a4b460567176807f42160bc82de64152320a572b9" exitCode=0 Jan 30 18:45:58 crc kubenswrapper[4782]: I0130 18:45:58.241596 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45s4h" event={"ID":"969b9c93-6d7d-4be0-befc-f3b1677b6a96","Type":"ContainerDied","Data":"64a0bc57a0a74312f2c8052a4b460567176807f42160bc82de64152320a572b9"} Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.781264 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.885863 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-utilities\") pod \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.885914 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxsvx\" (UniqueName: \"kubernetes.io/projected/969b9c93-6d7d-4be0-befc-f3b1677b6a96-kube-api-access-qxsvx\") pod \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.885961 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-catalog-content\") pod \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\" (UID: \"969b9c93-6d7d-4be0-befc-f3b1677b6a96\") " Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.888286 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-utilities" (OuterVolumeSpecName: "utilities") pod "969b9c93-6d7d-4be0-befc-f3b1677b6a96" (UID: "969b9c93-6d7d-4be0-befc-f3b1677b6a96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.893179 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969b9c93-6d7d-4be0-befc-f3b1677b6a96-kube-api-access-qxsvx" (OuterVolumeSpecName: "kube-api-access-qxsvx") pod "969b9c93-6d7d-4be0-befc-f3b1677b6a96" (UID: "969b9c93-6d7d-4be0-befc-f3b1677b6a96"). InnerVolumeSpecName "kube-api-access-qxsvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.929006 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "969b9c93-6d7d-4be0-befc-f3b1677b6a96" (UID: "969b9c93-6d7d-4be0-befc-f3b1677b6a96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.987434 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.987475 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/969b9c93-6d7d-4be0-befc-f3b1677b6a96-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:45:59 crc kubenswrapper[4782]: I0130 18:45:59.987488 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxsvx\" (UniqueName: \"kubernetes.io/projected/969b9c93-6d7d-4be0-befc-f3b1677b6a96-kube-api-access-qxsvx\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.261343 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45s4h" event={"ID":"969b9c93-6d7d-4be0-befc-f3b1677b6a96","Type":"ContainerDied","Data":"2b13c3c0a18cd87e0b80fc8592a3b40af172495726eb4b38a0cc0b9245b2b2e7"} Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.261426 4782 scope.go:117] "RemoveContainer" containerID="64a0bc57a0a74312f2c8052a4b460567176807f42160bc82de64152320a572b9" Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.261371 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45s4h" Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.292516 4782 scope.go:117] "RemoveContainer" containerID="1d7613b867fe0696c40d3881b4a80e851f8135c43ff28784458d8fccb3d28771" Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.307638 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-45s4h"] Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.328679 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-45s4h"] Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.335743 4782 scope.go:117] "RemoveContainer" containerID="fdb9afe4297da9889c80e5bac342005e3790e9d6482e867fc767dfbc7394ce3e" Jan 30 18:46:00 crc kubenswrapper[4782]: I0130 18:46:00.428342 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" path="/var/lib/kubelet/pods/969b9c93-6d7d-4be0-befc-f3b1677b6a96/volumes" Jan 30 18:46:03 crc kubenswrapper[4782]: I0130 18:46:03.101894 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-78nps" Jan 30 18:46:09 crc kubenswrapper[4782]: I0130 18:46:09.340141 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" event={"ID":"a765979e-db86-4d07-8a0a-96c61d42137c","Type":"ContainerStarted","Data":"fd595f01c3a355ba0dcfbd9fb6d7af548d9fd2e9c4c2c16aded8692c927f7538"} Jan 30 18:46:09 crc kubenswrapper[4782]: I0130 18:46:09.340971 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" Jan 30 18:46:09 crc kubenswrapper[4782]: I0130 18:46:09.362923 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" podStartSLOduration=3.365169544 
podStartE2EDuration="57.362901825s" podCreationTimestamp="2026-01-30 18:45:12 +0000 UTC" firstStartedPulling="2026-01-30 18:45:14.463382936 +0000 UTC m=+890.731760961" lastFinishedPulling="2026-01-30 18:46:08.461115217 +0000 UTC m=+944.729493242" observedRunningTime="2026-01-30 18:46:09.358452965 +0000 UTC m=+945.626831000" watchObservedRunningTime="2026-01-30 18:46:09.362901825 +0000 UTC m=+945.631279860" Jan 30 18:46:13 crc kubenswrapper[4782]: I0130 18:46:13.288979 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-78c8444fdd-928lz" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.959380 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c7f8cb7c-7bswp"] Jan 30 18:46:31 crc kubenswrapper[4782]: E0130 18:46:31.960049 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="extract-content" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.960061 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="extract-content" Jan 30 18:46:31 crc kubenswrapper[4782]: E0130 18:46:31.960070 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="registry-server" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.960076 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="registry-server" Jan 30 18:46:31 crc kubenswrapper[4782]: E0130 18:46:31.960100 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="extract-utilities" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.960110 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="extract-utilities" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.960247 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="969b9c93-6d7d-4be0-befc-f3b1677b6a96" containerName="registry-server" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.960962 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.963629 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xd58h" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.963807 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.963959 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.964073 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.969197 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7f8cb7c-7bswp"] Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.994157 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8km\" (UniqueName: \"kubernetes.io/projected/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-kube-api-access-zq8km\") pod \"dnsmasq-dns-c7f8cb7c-7bswp\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:31 crc kubenswrapper[4782]: I0130 18:46:31.994269 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-config\") pod \"dnsmasq-dns-c7f8cb7c-7bswp\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.038486 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-756d7955df-5fzhm"] Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.047216 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.051412 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.053385 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756d7955df-5fzhm"] Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.095689 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-dns-svc\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.096071 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-config\") pod \"dnsmasq-dns-c7f8cb7c-7bswp\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.096103 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-config\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.096122 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/0d9328a1-1001-414c-9b60-15fe596636e7-kube-api-access-qxtc2\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.096149 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8km\" (UniqueName: \"kubernetes.io/projected/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-kube-api-access-zq8km\") pod \"dnsmasq-dns-c7f8cb7c-7bswp\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.096937 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-config\") pod \"dnsmasq-dns-c7f8cb7c-7bswp\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.121498 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8km\" (UniqueName: \"kubernetes.io/projected/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-kube-api-access-zq8km\") pod \"dnsmasq-dns-c7f8cb7c-7bswp\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.196905 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-dns-svc\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.196990 
4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-config\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.197011 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/0d9328a1-1001-414c-9b60-15fe596636e7-kube-api-access-qxtc2\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.197964 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-dns-svc\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.198168 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-config\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.215909 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/0d9328a1-1001-414c-9b60-15fe596636e7-kube-api-access-qxtc2\") pod \"dnsmasq-dns-756d7955df-5fzhm\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.278570 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.363948 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.618049 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c7f8cb7c-7bswp"] Jan 30 18:46:32 crc kubenswrapper[4782]: I0130 18:46:32.872433 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-756d7955df-5fzhm"] Jan 30 18:46:32 crc kubenswrapper[4782]: W0130 18:46:32.877345 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d9328a1_1001_414c_9b60_15fe596636e7.slice/crio-db9bb41293e5b5d0d8e8ee5664bb27f09e1a468163a32e94301d380a16a7d5af WatchSource:0}: Error finding container db9bb41293e5b5d0d8e8ee5664bb27f09e1a468163a32e94301d380a16a7d5af: Status 404 returned error can't find the container with id db9bb41293e5b5d0d8e8ee5664bb27f09e1a468163a32e94301d380a16a7d5af Jan 30 18:46:33 crc kubenswrapper[4782]: I0130 18:46:33.551316 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756d7955df-5fzhm" event={"ID":"0d9328a1-1001-414c-9b60-15fe596636e7","Type":"ContainerStarted","Data":"db9bb41293e5b5d0d8e8ee5664bb27f09e1a468163a32e94301d380a16a7d5af"} Jan 30 18:46:33 crc kubenswrapper[4782]: I0130 18:46:33.554215 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" event={"ID":"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2","Type":"ContainerStarted","Data":"333b888b5b27f87ac4d435a990e881dbd23b47f6f63020d11884d30d2a7760d6"} Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.537678 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7f8cb7c-7bswp"] Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.567385 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fc9859b57-zkq8j"] Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.568613 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.582892 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fc9859b57-zkq8j"] Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.668712 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-config\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.668987 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-dns-svc\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.669122 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr8h4\" (UniqueName: \"kubernetes.io/projected/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-kube-api-access-tr8h4\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.771987 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-config\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.772071 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-dns-svc\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.772105 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr8h4\" (UniqueName: \"kubernetes.io/projected/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-kube-api-access-tr8h4\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.772875 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-config\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.773210 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-dns-svc\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.784734 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-756d7955df-5fzhm"] Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.800721 
4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr8h4\" (UniqueName: \"kubernetes.io/projected/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-kube-api-access-tr8h4\") pod \"dnsmasq-dns-5fc9859b57-zkq8j\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.824534 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d8689c4df-kggm7"] Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.825678 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.831150 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8689c4df-kggm7"] Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.874051 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-dns-svc\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.874127 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8484s\" (UniqueName: \"kubernetes.io/projected/42c9a852-0a4d-4134-9646-5111fa049b18-kube-api-access-8484s\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.874342 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-config\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.888356 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.976350 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-dns-svc\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.976410 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8484s\" (UniqueName: \"kubernetes.io/projected/42c9a852-0a4d-4134-9646-5111fa049b18-kube-api-access-8484s\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.976519 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-config\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.977498 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-dns-svc\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.977548 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-config\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:35 crc kubenswrapper[4782]: I0130 18:46:35.993353 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8484s\" (UniqueName: \"kubernetes.io/projected/42c9a852-0a4d-4134-9646-5111fa049b18-kube-api-access-8484s\") pod \"dnsmasq-dns-d8689c4df-kggm7\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.126031 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fc9859b57-zkq8j"] Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.143604 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.146642 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8666d45c85-h72w2"] Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.147710 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.159167 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8666d45c85-h72w2"] Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.280668 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-config\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.280756 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-dns-svc\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.280785 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkc2w\" (UniqueName: \"kubernetes.io/projected/bbc79520-07bf-4876-a889-9bcd4b3be4c7-kube-api-access-jkc2w\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.382154 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-config\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.382324 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-dns-svc\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.382359 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkc2w\" (UniqueName: \"kubernetes.io/projected/bbc79520-07bf-4876-a889-9bcd4b3be4c7-kube-api-access-jkc2w\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.383039 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-config\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.383344 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-dns-svc\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.398694 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkc2w\" (UniqueName: 
\"kubernetes.io/projected/bbc79520-07bf-4876-a889-9bcd4b3be4c7-kube-api-access-jkc2w\") pod \"dnsmasq-dns-8666d45c85-h72w2\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.463168 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.697136 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.698532 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.707439 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.708144 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.708282 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.708392 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.708538 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mtrhn" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.708600 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.708898 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.711819 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788082 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788145 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh762\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-kube-api-access-xh762\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788186 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788214 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788261 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788296 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-config-data\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788329 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788361 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788395 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ef56512-bf17-45df-9e3d-ff2e97f66252-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788437 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.788457 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ef56512-bf17-45df-9e3d-ff2e97f66252-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889609 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ef56512-bf17-45df-9e3d-ff2e97f66252-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889674 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889711 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh762\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-kube-api-access-xh762\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889744 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889773 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889801 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889834 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-config-data\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889862 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889893 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889933 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ef56512-bf17-45df-9e3d-ff2e97f66252-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.889973 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.890468 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") device 
mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.890740 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.890740 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.890800 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.891308 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-config-data\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.892906 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.895379 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ef56512-bf17-45df-9e3d-ff2e97f66252-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.896073 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ef56512-bf17-45df-9e3d-ff2e97f66252-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.896786 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.912012 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.913257 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh762\" (UniqueName: 
\"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-kube-api-access-xh762\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.916857 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " pod="openstack/rabbitmq-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.965453 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.967563 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.970999 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.970999 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.971030 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.971096 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.972037 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.972200 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.972306 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8grjl" Jan 30 18:46:36 crc kubenswrapper[4782]: I0130 18:46:36.986192 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.034616 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.091999 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092284 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092388 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092572 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgf22\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-kube-api-access-wgf22\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092654 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092722 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092782 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30ded24a-ee08-4d96-80f1-3d5793ec76bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092863 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.092939 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.093016 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.093080 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30ded24a-ee08-4d96-80f1-3d5793ec76bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194592 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194657 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194688 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194715 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30ded24a-ee08-4d96-80f1-3d5793ec76bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194737 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194815 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194838 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc 
kubenswrapper[4782]: I0130 18:46:37.194879 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgf22\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-kube-api-access-wgf22\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194926 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194953 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.194976 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30ded24a-ee08-4d96-80f1-3d5793ec76bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.195485 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.195565 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.196002 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.196343 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.197443 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.197896 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.210956 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30ded24a-ee08-4d96-80f1-3d5793ec76bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.211063 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.211159 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30ded24a-ee08-4d96-80f1-3d5793ec76bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.211591 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.213350 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgf22\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-kube-api-access-wgf22\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.217459 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.255306 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.258417 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.261949 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.262198 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.262629 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.262789 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.262997 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.263202 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.263453 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-zbd7s" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.278105 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.318519 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421292 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e3b2844-afde-444d-b7ee-cddd8b543bf6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421346 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421366 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e3b2844-afde-444d-b7ee-cddd8b543bf6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421383 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421412 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421605 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421721 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421748 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421765 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj92g\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-kube-api-access-dj92g\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421794 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.421858 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.522981 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523016 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e3b2844-afde-444d-b7ee-cddd8b543bf6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523035 
4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523062 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523095 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523130 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523158 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523184 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj92g\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-kube-api-access-dj92g\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523217 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523268 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523344 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e3b2844-afde-444d-b7ee-cddd8b543bf6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523383 4782 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.523474 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.524031 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.524205 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.524265 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.524597 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9e3b2844-afde-444d-b7ee-cddd8b543bf6-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.527391 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.528083 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9e3b2844-afde-444d-b7ee-cddd8b543bf6-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.528099 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.539483 4782 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9e3b2844-afde-444d-b7ee-cddd8b543bf6-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.540398 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj92g\" (UniqueName: \"kubernetes.io/projected/9e3b2844-afde-444d-b7ee-cddd8b543bf6-kube-api-access-dj92g\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.551619 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"9e3b2844-afde-444d-b7ee-cddd8b543bf6\") " pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:37 crc kubenswrapper[4782]: I0130 18:46:37.635380 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.702952 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.718433 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.720567 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kfdzl" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.720789 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.721110 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.723140 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.723151 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.726844 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844070 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070c9056-8c32-47ae-b937-b3e4b2b464e7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844120 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/070c9056-8c32-47ae-b937-b3e4b2b464e7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844156 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-config-data-default\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844236 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/070c9056-8c32-47ae-b937-b3e4b2b464e7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844265 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844323 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-kolla-config\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844341 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brr7z\" (UniqueName: \"kubernetes.io/projected/070c9056-8c32-47ae-b937-b3e4b2b464e7-kube-api-access-brr7z\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.844364 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946286 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/070c9056-8c32-47ae-b937-b3e4b2b464e7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946330 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-config-data-default\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946391 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/070c9056-8c32-47ae-b937-b3e4b2b464e7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946421 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946446 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-kolla-config\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946475 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brr7z\" (UniqueName: \"kubernetes.io/projected/070c9056-8c32-47ae-b937-b3e4b2b464e7-kube-api-access-brr7z\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946496 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946539 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070c9056-8c32-47ae-b937-b3e4b2b464e7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.946629 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/070c9056-8c32-47ae-b937-b3e4b2b464e7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.947430 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-config-data-default\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.947498 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-kolla-config\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.947653 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.948595 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/070c9056-8c32-47ae-b937-b3e4b2b464e7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.961074 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/070c9056-8c32-47ae-b937-b3e4b2b464e7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.961142 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070c9056-8c32-47ae-b937-b3e4b2b464e7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.969464 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brr7z\" (UniqueName: \"kubernetes.io/projected/070c9056-8c32-47ae-b937-b3e4b2b464e7-kube-api-access-brr7z\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:38 crc kubenswrapper[4782]: I0130 18:46:38.973310 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"070c9056-8c32-47ae-b937-b3e4b2b464e7\") " pod="openstack/openstack-galera-0" Jan 30 18:46:39 crc kubenswrapper[4782]: I0130 18:46:39.054969 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.140610 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.142691 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.149043 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ftt99" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.149275 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.150597 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.152718 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.156049 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266219 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266330 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77fz\" (UniqueName: \"kubernetes.io/projected/a458f19f-501f-4703-9cfe-d8638418215b-kube-api-access-s77fz\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266360 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266382 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a458f19f-501f-4703-9cfe-d8638418215b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266413 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266538 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458f19f-501f-4703-9cfe-d8638418215b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266584 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.266603 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a458f19f-501f-4703-9cfe-d8638418215b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.368545 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s77fz\" (UniqueName: \"kubernetes.io/projected/a458f19f-501f-4703-9cfe-d8638418215b-kube-api-access-s77fz\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.368822 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.368929 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a458f19f-501f-4703-9cfe-d8638418215b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.369050 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.369163 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458f19f-501f-4703-9cfe-d8638418215b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.369305 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.369425 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.369525 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a458f19f-501f-4703-9cfe-d8638418215b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.369705 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.370631 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.373119 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.374088 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a458f19f-501f-4703-9cfe-d8638418215b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.375177 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a458f19f-501f-4703-9cfe-d8638418215b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.376170 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a458f19f-501f-4703-9cfe-d8638418215b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.377044 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458f19f-501f-4703-9cfe-d8638418215b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.396771 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77fz\" (UniqueName: \"kubernetes.io/projected/a458f19f-501f-4703-9cfe-d8638418215b-kube-api-access-s77fz\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.410185 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.411670 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.421221 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8lsxn" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.421453 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.421738 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.430193 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a458f19f-501f-4703-9cfe-d8638418215b\") " pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.433193 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.463958 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.471134 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-kolla-config\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.471203 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.471280 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hwtv\" (UniqueName: \"kubernetes.io/projected/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-kube-api-access-9hwtv\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.471313 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-config-data\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.471362 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.572978 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.573630 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-kolla-config\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.573706 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.573749 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hwtv\" (UniqueName: \"kubernetes.io/projected/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-kube-api-access-9hwtv\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.573797 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-config-data\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.574622 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-kolla-config\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.575073 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-config-data\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.579641 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.580254 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.605759 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hwtv\" (UniqueName: \"kubernetes.io/projected/7c3e9bb9-ed43-4499-88c1-2bde956a84b8-kube-api-access-9hwtv\") pod \"memcached-0\" (UID: \"7c3e9bb9-ed43-4499-88c1-2bde956a84b8\") " pod="openstack/memcached-0" Jan 30 18:46:40 crc kubenswrapper[4782]: I0130 18:46:40.816185 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 18:46:41 crc kubenswrapper[4782]: I0130 18:46:41.591846 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:46:42 crc kubenswrapper[4782]: I0130 18:46:42.705385 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:46:42 crc kubenswrapper[4782]: I0130 18:46:42.706313 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 18:46:42 crc kubenswrapper[4782]: I0130 18:46:42.712337 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-m5c99" Jan 30 18:46:42 crc kubenswrapper[4782]: I0130 18:46:42.718920 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:46:42 crc kubenswrapper[4782]: I0130 18:46:42.813421 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfqf\" (UniqueName: \"kubernetes.io/projected/a89d76b9-7010-4d8b-ac8e-fac56394928d-kube-api-access-2mfqf\") pod \"kube-state-metrics-0\" (UID: \"a89d76b9-7010-4d8b-ac8e-fac56394928d\") " pod="openstack/kube-state-metrics-0" Jan 30 18:46:42 crc kubenswrapper[4782]: I0130 18:46:42.915172 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfqf\" (UniqueName: \"kubernetes.io/projected/a89d76b9-7010-4d8b-ac8e-fac56394928d-kube-api-access-2mfqf\") pod \"kube-state-metrics-0\" (UID: \"a89d76b9-7010-4d8b-ac8e-fac56394928d\") " pod="openstack/kube-state-metrics-0" Jan 30 18:46:42 crc kubenswrapper[4782]: I0130 18:46:42.964275 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfqf\" (UniqueName: \"kubernetes.io/projected/a89d76b9-7010-4d8b-ac8e-fac56394928d-kube-api-access-2mfqf\") pod \"kube-state-metrics-0\" (UID: \"a89d76b9-7010-4d8b-ac8e-fac56394928d\") " pod="openstack/kube-state-metrics-0" Jan 30 18:46:43 crc kubenswrapper[4782]: I0130 18:46:43.036494 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.206170 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.208107 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.210867 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.210927 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.210942 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.210997 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.211261 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.211348 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.218107 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.219322 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.221965 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4h9n8" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.348358 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.348613 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.348637 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.348666 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.348707 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.348725 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.348997 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.349077 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.349101 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdmb7\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-kube-api-access-cdmb7\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.349170 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450529 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450593 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450665 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450689 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450737 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdmb7\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-kube-api-access-cdmb7\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450768 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450819 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450844 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450882 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.450910 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.451356 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.453590 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.454426 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.454542 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.455873 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.456802 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.457945 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.458571 4782 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.458623 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7160aa072a4d7b723e1d7d729bf0a8cb68e388f5eb2d179074e77609eba87da8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.465357 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.469650 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdmb7\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-kube-api-access-cdmb7\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.488480 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:44 crc kubenswrapper[4782]: I0130 18:46:44.529106 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 18:46:45 crc kubenswrapper[4782]: I0130 18:46:45.994134 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pz2pk"] Jan 30 18:46:45 crc kubenswrapper[4782]: I0130 18:46:45.995149 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:45 crc kubenswrapper[4782]: I0130 18:46:45.997277 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 30 18:46:45 crc kubenswrapper[4782]: I0130 18:46:45.997554 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-4x4fw" Jan 30 18:46:45 crc kubenswrapper[4782]: I0130 18:46:45.997803 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.022480 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bk7c8"] Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.024860 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.030970 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pz2pk"] Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.058009 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bk7c8"] Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.075730 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-lib\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.075777 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-log\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.075799 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-run\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.075849 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d457a1-1878-47f1-a1d3-eac450864978-ovn-controller-tls-certs\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.075938 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-run-ovn\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.075960 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shklc\" (UniqueName: \"kubernetes.io/projected/91d457a1-1878-47f1-a1d3-eac450864978-kube-api-access-shklc\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.075979 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91d457a1-1878-47f1-a1d3-eac450864978-scripts\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.076015 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvv8z\" (UniqueName: \"kubernetes.io/projected/f3433e7d-6a6b-4f6b-b061-22479d5391f9-kube-api-access-cvv8z\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc 
kubenswrapper[4782]: I0130 18:46:46.076044 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-run\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.076061 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d457a1-1878-47f1-a1d3-eac450864978-combined-ca-bundle\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.076087 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-etc-ovs\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.076102 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3433e7d-6a6b-4f6b-b061-22479d5391f9-scripts\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.076121 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-log-ovn\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177303 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-run-ovn\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177353 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shklc\" (UniqueName: \"kubernetes.io/projected/91d457a1-1878-47f1-a1d3-eac450864978-kube-api-access-shklc\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177375 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91d457a1-1878-47f1-a1d3-eac450864978-scripts\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177414 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvv8z\" (UniqueName: \"kubernetes.io/projected/f3433e7d-6a6b-4f6b-b061-22479d5391f9-kube-api-access-cvv8z\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177462 4782 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-run\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177483 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d457a1-1878-47f1-a1d3-eac450864978-combined-ca-bundle\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177510 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-etc-ovs\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177529 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3433e7d-6a6b-4f6b-b061-22479d5391f9-scripts\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177546 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-log-ovn\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177581 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-lib\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177600 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-log\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177622 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-run\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.177639 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d457a1-1878-47f1-a1d3-eac450864978-ovn-controller-tls-certs\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.178198 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-log-ovn\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" 
Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.178277 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-lib\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.178316 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-etc-ovs\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.178361 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-log\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.178468 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3433e7d-6a6b-4f6b-b061-22479d5391f9-var-run\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.178481 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-run\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.178469 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91d457a1-1878-47f1-a1d3-eac450864978-var-run-ovn\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.179709 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91d457a1-1878-47f1-a1d3-eac450864978-scripts\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.180298 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3433e7d-6a6b-4f6b-b061-22479d5391f9-scripts\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.197902 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d457a1-1878-47f1-a1d3-eac450864978-ovn-controller-tls-certs\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.198059 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d457a1-1878-47f1-a1d3-eac450864978-combined-ca-bundle\") pod \"ovn-controller-pz2pk\" (UID: 
\"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.214320 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvv8z\" (UniqueName: \"kubernetes.io/projected/f3433e7d-6a6b-4f6b-b061-22479d5391f9-kube-api-access-cvv8z\") pod \"ovn-controller-ovs-bk7c8\" (UID: \"f3433e7d-6a6b-4f6b-b061-22479d5391f9\") " pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.216799 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shklc\" (UniqueName: \"kubernetes.io/projected/91d457a1-1878-47f1-a1d3-eac450864978-kube-api-access-shklc\") pod \"ovn-controller-pz2pk\" (UID: \"91d457a1-1878-47f1-a1d3-eac450864978\") " pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.310557 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pz2pk" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.320471 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.327487 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.334182 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.334360 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.334487 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vdx5p" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.334621 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.334729 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.341999 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.342190 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380477 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380540 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5e12229-4958-47a9-9210-18fba05c1319-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380573 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380588 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380622 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e12229-4958-47a9-9210-18fba05c1319-config\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380639 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380661 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq45t\" (UniqueName: \"kubernetes.io/projected/a5e12229-4958-47a9-9210-18fba05c1319-kube-api-access-cq45t\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.380683 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a5e12229-4958-47a9-9210-18fba05c1319-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.481901 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5e12229-4958-47a9-9210-18fba05c1319-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.481957 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.481975 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.482007 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e12229-4958-47a9-9210-18fba05c1319-config\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.482027 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.482048 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq45t\" (UniqueName: \"kubernetes.io/projected/a5e12229-4958-47a9-9210-18fba05c1319-kube-api-access-cq45t\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.482069 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a5e12229-4958-47a9-9210-18fba05c1319-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.482110 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.483150 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.483291 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a5e12229-4958-47a9-9210-18fba05c1319-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.483339 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e12229-4958-47a9-9210-18fba05c1319-config\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " 
pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.483946 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5e12229-4958-47a9-9210-18fba05c1319-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.486469 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.486497 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.498767 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e12229-4958-47a9-9210-18fba05c1319-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.504306 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.512040 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq45t\" (UniqueName: \"kubernetes.io/projected/a5e12229-4958-47a9-9210-18fba05c1319-kube-api-access-cq45t\") pod \"ovsdbserver-nb-0\" (UID: \"a5e12229-4958-47a9-9210-18fba05c1319\") " pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:46 crc kubenswrapper[4782]: I0130 18:46:46.682781 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 18:46:47 crc kubenswrapper[4782]: W0130 18:46:47.865402 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30ded24a_ee08_4d96_80f1_3d5793ec76bb.slice/crio-7821be7ff6e627cd88e32c30aa83967e8a1a219ccd02591580a69f0aa6d46749 WatchSource:0}: Error finding container 7821be7ff6e627cd88e32c30aa83967e8a1a219ccd02591580a69f0aa6d46749: Status 404 returned error can't find the container with id 7821be7ff6e627cd88e32c30aa83967e8a1a219ccd02591580a69f0aa6d46749 Jan 30 18:46:48 crc kubenswrapper[4782]: I0130 18:46:48.672049 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30ded24a-ee08-4d96-80f1-3d5793ec76bb","Type":"ContainerStarted","Data":"7821be7ff6e627cd88e32c30aa83967e8a1a219ccd02591580a69f0aa6d46749"} Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.793354 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.793478 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.811371 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.812592 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.816081 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.816294 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.816936 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.817115 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-p89sr" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.842407 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.941297 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6123a5a8-5a6d-455c-9418-71d31b35e2f3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.941382 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.941441 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.941640 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.941705 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6123a5a8-5a6d-455c-9418-71d31b35e2f3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.941858 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6123a5a8-5a6d-455c-9418-71d31b35e2f3-config\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.941938 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " 
pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:49 crc kubenswrapper[4782]: I0130 18:46:49.942034 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6b6\" (UniqueName: \"kubernetes.io/projected/6123a5a8-5a6d-455c-9418-71d31b35e2f3-kube-api-access-5l6b6\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043627 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043723 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043752 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6123a5a8-5a6d-455c-9418-71d31b35e2f3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043813 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6123a5a8-5a6d-455c-9418-71d31b35e2f3-config\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043842 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043871 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6b6\" (UniqueName: \"kubernetes.io/projected/6123a5a8-5a6d-455c-9418-71d31b35e2f3-kube-api-access-5l6b6\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043915 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6123a5a8-5a6d-455c-9418-71d31b35e2f3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.043948 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.044516 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.045107 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6123a5a8-5a6d-455c-9418-71d31b35e2f3-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.047542 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6123a5a8-5a6d-455c-9418-71d31b35e2f3-config\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.050388 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.051125 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.054589 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6123a5a8-5a6d-455c-9418-71d31b35e2f3-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.061273 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6123a5a8-5a6d-455c-9418-71d31b35e2f3-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.063996 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6b6\" (UniqueName: \"kubernetes.io/projected/6123a5a8-5a6d-455c-9418-71d31b35e2f3-kube-api-access-5l6b6\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.070113 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6123a5a8-5a6d-455c-9418-71d31b35e2f3\") " pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:50 crc kubenswrapper[4782]: I0130 18:46:50.159953 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.836012 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.837067 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.837198 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.5:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qxtc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-756d7955df-5fzhm_openstack(0d9328a1-1001-414c-9b60-15fe596636e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.838597 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-756d7955df-5fzhm" podUID="0d9328a1-1001-414c-9b60-15fe596636e7" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.849083 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.849141 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.849279 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.5:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zq8km,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-c7f8cb7c-7bswp_openstack(bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:46:54 crc kubenswrapper[4782]: E0130 18:46:54.850749 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" podUID="bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2" Jan 30 18:46:55 crc kubenswrapper[4782]: I0130 18:46:55.255376 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 18:46:55 crc kubenswrapper[4782]: I0130 18:46:55.265220 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Jan 30 18:46:55 crc kubenswrapper[4782]: I0130 18:46:55.414495 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8666d45c85-h72w2"] Jan 30 18:46:55 crc kubenswrapper[4782]: I0130 18:46:55.420802 
4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 18:46:55 crc kubenswrapper[4782]: I0130 18:46:55.426105 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:46:55 crc kubenswrapper[4782]: I0130 18:46:55.625370 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 18:46:55 crc kubenswrapper[4782]: I0130 18:46:55.634256 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fc9859b57-zkq8j"] Jan 30 18:46:56 crc kubenswrapper[4782]: W0130 18:46:56.590979 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ef56512_bf17_45df_9e3d_ff2e97f66252.slice/crio-c52c50b15dfecad3a0fcfbfc767bf94be310f485b841e8b350aa05b73c636151 WatchSource:0}: Error finding container c52c50b15dfecad3a0fcfbfc767bf94be310f485b841e8b350aa05b73c636151: Status 404 returned error can't find the container with id c52c50b15dfecad3a0fcfbfc767bf94be310f485b841e8b350aa05b73c636151 Jan 30 18:46:56 crc kubenswrapper[4782]: W0130 18:46:56.624520 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda458f19f_501f_4703_9cfe_d8638418215b.slice/crio-5bda06698e956160838bfe2b4b38fbe9e46c4496dd6ad6406d97f53c0fde65a3 WatchSource:0}: Error finding container 5bda06698e956160838bfe2b4b38fbe9e46c4496dd6ad6406d97f53c0fde65a3: Status 404 returned error can't find the container with id 5bda06698e956160838bfe2b4b38fbe9e46c4496dd6ad6406d97f53c0fde65a3 Jan 30 18:46:56 crc kubenswrapper[4782]: W0130 18:46:56.645799 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3e9bb9_ed43_4499_88c1_2bde956a84b8.slice/crio-ab8e8a04f28115a623a73c61603ea8db30a8dafef96ab300833aaf750f21000f WatchSource:0}: Error finding container ab8e8a04f28115a623a73c61603ea8db30a8dafef96ab300833aaf750f21000f: Status 404 returned error can't find the container with id ab8e8a04f28115a623a73c61603ea8db30a8dafef96ab300833aaf750f21000f Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.742847 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.778369 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.781901 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-756d7955df-5fzhm" event={"ID":"0d9328a1-1001-414c-9b60-15fe596636e7","Type":"ContainerDied","Data":"db9bb41293e5b5d0d8e8ee5664bb27f09e1a468163a32e94301d380a16a7d5af"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.782148 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-756d7955df-5fzhm" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.808060 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7c3e9bb9-ed43-4499-88c1-2bde956a84b8","Type":"ContainerStarted","Data":"ab8e8a04f28115a623a73c61603ea8db30a8dafef96ab300833aaf750f21000f"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.825625 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a458f19f-501f-4703-9cfe-d8638418215b","Type":"ContainerStarted","Data":"5bda06698e956160838bfe2b4b38fbe9e46c4496dd6ad6406d97f53c0fde65a3"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.828460 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" event={"ID":"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2","Type":"ContainerDied","Data":"333b888b5b27f87ac4d435a990e881dbd23b47f6f63020d11884d30d2a7760d6"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.828489 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c7f8cb7c-7bswp" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.829625 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"070c9056-8c32-47ae-b937-b3e4b2b464e7","Type":"ContainerStarted","Data":"5269a4e895c4e748eb0bf25ba53293ed11c3aae72363119c6e068a9c9dc2d15b"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.832726 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" event={"ID":"5cf13a98-47a3-4b6b-9448-16da2d5a1b08","Type":"ContainerStarted","Data":"e170d65fc9c0af1522be32d77259093be371788664a87ffbf60cfec3e3ed61df"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.834118 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9e3b2844-afde-444d-b7ee-cddd8b543bf6","Type":"ContainerStarted","Data":"7bbdc8ff7611277e3b5db8b563662c5c1326f191706b2d44650b52aea3db265c"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.835265 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ef56512-bf17-45df-9e3d-ff2e97f66252","Type":"ContainerStarted","Data":"c52c50b15dfecad3a0fcfbfc767bf94be310f485b841e8b350aa05b73c636151"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.837775 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" event={"ID":"bbc79520-07bf-4876-a889-9bcd4b3be4c7","Type":"ContainerStarted","Data":"43cfe5e93cbd71c6353cc4b52b6fe796d586dc5bbfdfaf14d995966f55726a47"} Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.905378 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-config\") pod \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.905461 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-config\") pod \"0d9328a1-1001-414c-9b60-15fe596636e7\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.905504 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-dns-svc\") pod \"0d9328a1-1001-414c-9b60-15fe596636e7\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.905605 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/0d9328a1-1001-414c-9b60-15fe596636e7-kube-api-access-qxtc2\") pod \"0d9328a1-1001-414c-9b60-15fe596636e7\" (UID: \"0d9328a1-1001-414c-9b60-15fe596636e7\") " Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.905639 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8km\" (UniqueName: \"kubernetes.io/projected/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-kube-api-access-zq8km\") pod \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\" (UID: \"bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2\") " Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.906392 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-config" (OuterVolumeSpecName: "config") pod "0d9328a1-1001-414c-9b60-15fe596636e7" (UID: "0d9328a1-1001-414c-9b60-15fe596636e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.906516 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.906755 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d9328a1-1001-414c-9b60-15fe596636e7" (UID: "0d9328a1-1001-414c-9b60-15fe596636e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.907373 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-config" (OuterVolumeSpecName: "config") pod "bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2" (UID: "bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.910833 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-kube-api-access-zq8km" (OuterVolumeSpecName: "kube-api-access-zq8km") pod "bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2" (UID: "bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2"). InnerVolumeSpecName "kube-api-access-zq8km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:46:56 crc kubenswrapper[4782]: I0130 18:46:56.912375 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d9328a1-1001-414c-9b60-15fe596636e7-kube-api-access-qxtc2" (OuterVolumeSpecName: "kube-api-access-qxtc2") pod "0d9328a1-1001-414c-9b60-15fe596636e7" (UID: "0d9328a1-1001-414c-9b60-15fe596636e7"). InnerVolumeSpecName "kube-api-access-qxtc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.007652 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.007693 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d9328a1-1001-414c-9b60-15fe596636e7-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.007707 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/0d9328a1-1001-414c-9b60-15fe596636e7-kube-api-access-qxtc2\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.007721 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8km\" (UniqueName: \"kubernetes.io/projected/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2-kube-api-access-zq8km\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.150046 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-756d7955df-5fzhm"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.162542 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-756d7955df-5fzhm"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.180488 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.185816 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d8689c4df-kggm7"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.310590 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c7f8cb7c-7bswp"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.327481 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c7f8cb7c-7bswp"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.333659 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pz2pk"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.340434 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.389704 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.453074 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 18:46:57 crc kubenswrapper[4782]: W0130 18:46:57.609656 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5e12229_4958_47a9_9210_18fba05c1319.slice/crio-b12d55b446ef448725b437fb4f6c4392b3e70d27e04cb33e65c1c9214173d142 WatchSource:0}: Error finding container b12d55b446ef448725b437fb4f6c4392b3e70d27e04cb33e65c1c9214173d142: Status 404 returned error can't find the container with id b12d55b446ef448725b437fb4f6c4392b3e70d27e04cb33e65c1c9214173d142 Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.845394 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"a5e12229-4958-47a9-9210-18fba05c1319","Type":"ContainerStarted","Data":"b12d55b446ef448725b437fb4f6c4392b3e70d27e04cb33e65c1c9214173d142"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.846956 4782 generic.go:334] "Generic (PLEG): container finished" podID="5cf13a98-47a3-4b6b-9448-16da2d5a1b08" containerID="471dff1e7d185ca176196137b882ee730ef22c862e617c3119adf89886d722e0" exitCode=0 Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.847000 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" event={"ID":"5cf13a98-47a3-4b6b-9448-16da2d5a1b08","Type":"ContainerDied","Data":"471dff1e7d185ca176196137b882ee730ef22c862e617c3119adf89886d722e0"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.905332 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6123a5a8-5a6d-455c-9418-71d31b35e2f3","Type":"ContainerStarted","Data":"88e3ac8791471f73781bb8c2044c4469a8eaf6f4f8306e500f11e45a99575634"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.907595 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a89d76b9-7010-4d8b-ac8e-fac56394928d","Type":"ContainerStarted","Data":"2f39b6e653540ebf2b4f78792590051ee026fb66fac3ea5c6da5cddf04ef014b"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.909934 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk" event={"ID":"91d457a1-1878-47f1-a1d3-eac450864978","Type":"ContainerStarted","Data":"78c7111b369028118020ecf894f69c69e3c57b227d91163349586e9c7a4b9147"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.911996 4782 generic.go:334] "Generic (PLEG): container finished" podID="42c9a852-0a4d-4134-9646-5111fa049b18" containerID="5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be" exitCode=0 Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.912103 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" event={"ID":"42c9a852-0a4d-4134-9646-5111fa049b18","Type":"ContainerDied","Data":"5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.912138 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" event={"ID":"42c9a852-0a4d-4134-9646-5111fa049b18","Type":"ContainerStarted","Data":"64de1e0a1159fa313acd192aa89d58e59c2d02711fee2ff82a9a63983f5de2ac"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.913327 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerStarted","Data":"370131147ef6e64d8cc01a054b6ba833450a2395f218959eb61e3d72ef39c46d"} Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.915995 4782 generic.go:334] "Generic (PLEG): container finished" podID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerID="fb815805e7de86e70f9a0aa4f3224a66c2e86ecca3d4f8f3027550a456e4dd38" exitCode=0 Jan 30 18:46:57 crc kubenswrapper[4782]: I0130 18:46:57.916045 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" event={"ID":"bbc79520-07bf-4876-a889-9bcd4b3be4c7","Type":"ContainerDied","Data":"fb815805e7de86e70f9a0aa4f3224a66c2e86ecca3d4f8f3027550a456e4dd38"} Jan 30 18:46:58 crc kubenswrapper[4782]: I0130 18:46:58.227212 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-ovs-bk7c8"] Jan 30 18:46:58 crc kubenswrapper[4782]: I0130 18:46:58.423412 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d9328a1-1001-414c-9b60-15fe596636e7" path="/var/lib/kubelet/pods/0d9328a1-1001-414c-9b60-15fe596636e7/volumes" Jan 30 18:46:58 crc kubenswrapper[4782]: I0130 18:46:58.424172 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2" path="/var/lib/kubelet/pods/bfd2b4ee-9a5b-4306-a65f-529fa33cf0a2/volumes" Jan 30 18:46:58 crc kubenswrapper[4782]: I0130 18:46:58.926818 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9e3b2844-afde-444d-b7ee-cddd8b543bf6","Type":"ContainerStarted","Data":"e5cddf4c75c6fb67ec32f3597e9291de31dcd4a1c3cc5f5854d34e84b38917bf"} Jan 30 18:46:58 crc kubenswrapper[4782]: I0130 18:46:58.928310 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ef56512-bf17-45df-9e3d-ff2e97f66252","Type":"ContainerStarted","Data":"f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18"} Jan 30 18:46:58 crc kubenswrapper[4782]: I0130 18:46:58.929717 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30ded24a-ee08-4d96-80f1-3d5793ec76bb","Type":"ContainerStarted","Data":"4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521"} Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.349642 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.384947 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-dns-svc\") pod \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.385034 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-config\") pod \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.385179 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr8h4\" (UniqueName: \"kubernetes.io/projected/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-kube-api-access-tr8h4\") pod \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\" (UID: \"5cf13a98-47a3-4b6b-9448-16da2d5a1b08\") " Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.389758 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-kube-api-access-tr8h4" (OuterVolumeSpecName: "kube-api-access-tr8h4") pod "5cf13a98-47a3-4b6b-9448-16da2d5a1b08" (UID: "5cf13a98-47a3-4b6b-9448-16da2d5a1b08"). InnerVolumeSpecName "kube-api-access-tr8h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.401477 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5cf13a98-47a3-4b6b-9448-16da2d5a1b08" (UID: "5cf13a98-47a3-4b6b-9448-16da2d5a1b08"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.405293 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-config" (OuterVolumeSpecName: "config") pod "5cf13a98-47a3-4b6b-9448-16da2d5a1b08" (UID: "5cf13a98-47a3-4b6b-9448-16da2d5a1b08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.486705 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr8h4\" (UniqueName: \"kubernetes.io/projected/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-kube-api-access-tr8h4\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.487042 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.487058 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cf13a98-47a3-4b6b-9448-16da2d5a1b08-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.937647 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.937639 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc9859b57-zkq8j" event={"ID":"5cf13a98-47a3-4b6b-9448-16da2d5a1b08","Type":"ContainerDied","Data":"e170d65fc9c0af1522be32d77259093be371788664a87ffbf60cfec3e3ed61df"} Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.937830 4782 scope.go:117] "RemoveContainer" containerID="471dff1e7d185ca176196137b882ee730ef22c862e617c3119adf89886d722e0" Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.939102 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bk7c8" event={"ID":"f3433e7d-6a6b-4f6b-b061-22479d5391f9","Type":"ContainerStarted","Data":"18247fbb51244e8c6f768417bc096b4737908ebcbb7d2f32dbdef877d60ab322"} Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.990437 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fc9859b57-zkq8j"] Jan 30 18:46:59 crc kubenswrapper[4782]: I0130 18:46:59.997436 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fc9859b57-zkq8j"] Jan 30 18:47:00 crc kubenswrapper[4782]: I0130 18:47:00.420361 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf13a98-47a3-4b6b-9448-16da2d5a1b08" path="/var/lib/kubelet/pods/5cf13a98-47a3-4b6b-9448-16da2d5a1b08/volumes" Jan 30 18:47:01 crc kubenswrapper[4782]: I0130 18:47:01.956063 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" event={"ID":"bbc79520-07bf-4876-a889-9bcd4b3be4c7","Type":"ContainerStarted","Data":"7be8a0f509b2b1fd773c6b227fa70a90f67a96dd0e8af262aa73cb429c59845d"} Jan 30 18:47:01 crc kubenswrapper[4782]: I0130 18:47:01.956405 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:47:01 crc kubenswrapper[4782]: I0130 18:47:01.971156 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" podStartSLOduration=25.756977871 
podStartE2EDuration="25.971135617s" podCreationTimestamp="2026-01-30 18:46:36 +0000 UTC" firstStartedPulling="2026-01-30 18:46:56.624534889 +0000 UTC m=+992.892912924" lastFinishedPulling="2026-01-30 18:46:56.838692645 +0000 UTC m=+993.107070670" observedRunningTime="2026-01-30 18:47:01.970288496 +0000 UTC m=+998.238666521" watchObservedRunningTime="2026-01-30 18:47:01.971135617 +0000 UTC m=+998.239513642" Jan 30 18:47:05 crc kubenswrapper[4782]: I0130 18:47:05.995125 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" event={"ID":"42c9a852-0a4d-4134-9646-5111fa049b18","Type":"ContainerStarted","Data":"513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e"} Jan 30 18:47:05 crc kubenswrapper[4782]: I0130 18:47:05.996162 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:47:06 crc kubenswrapper[4782]: I0130 18:47:06.036074 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" podStartSLOduration=31.036044561 podStartE2EDuration="31.036044561s" podCreationTimestamp="2026-01-30 18:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:06.0223064 +0000 UTC m=+1002.290684445" watchObservedRunningTime="2026-01-30 18:47:06.036044561 +0000 UTC m=+1002.304422596" Jan 30 18:47:06 crc kubenswrapper[4782]: I0130 18:47:06.467548 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:47:06 crc kubenswrapper[4782]: I0130 18:47:06.543100 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8689c4df-kggm7"] Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.016879 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7c3e9bb9-ed43-4499-88c1-2bde956a84b8","Type":"ContainerStarted","Data":"750a0beefcbdfdcff699bc523415a0a0658cb5d11874112b88c9d99cd525867a"} Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.017483 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.018321 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a458f19f-501f-4703-9cfe-d8638418215b","Type":"ContainerStarted","Data":"c3f7ad4e916efca0747754b148003da2deaf25fa37fba2c4f4b5187fcecfedc3"} Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.022448 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" podUID="42c9a852-0a4d-4134-9646-5111fa049b18" containerName="dnsmasq-dns" containerID="cri-o://513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e" gracePeriod=10 Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.023667 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"070c9056-8c32-47ae-b937-b3e4b2b464e7","Type":"ContainerStarted","Data":"c0318ed286c0db67dd7341687540d6b557d6b7e714e19628d84e2c715a6033a6"} Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.083317 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.368246277 podStartE2EDuration="28.083288509s" podCreationTimestamp="2026-01-30 18:46:40 +0000 UTC" 
firstStartedPulling="2026-01-30 18:46:56.665967655 +0000 UTC m=+992.934345680" lastFinishedPulling="2026-01-30 18:47:01.381009887 +0000 UTC m=+997.649387912" observedRunningTime="2026-01-30 18:47:08.038627573 +0000 UTC m=+1004.307005608" watchObservedRunningTime="2026-01-30 18:47:08.083288509 +0000 UTC m=+1004.351666574" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.510184 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.662403 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8484s\" (UniqueName: \"kubernetes.io/projected/42c9a852-0a4d-4134-9646-5111fa049b18-kube-api-access-8484s\") pod \"42c9a852-0a4d-4134-9646-5111fa049b18\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.662613 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-dns-svc\") pod \"42c9a852-0a4d-4134-9646-5111fa049b18\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.662651 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-config\") pod \"42c9a852-0a4d-4134-9646-5111fa049b18\" (UID: \"42c9a852-0a4d-4134-9646-5111fa049b18\") " Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.674765 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c9a852-0a4d-4134-9646-5111fa049b18-kube-api-access-8484s" (OuterVolumeSpecName: "kube-api-access-8484s") pod "42c9a852-0a4d-4134-9646-5111fa049b18" (UID: "42c9a852-0a4d-4134-9646-5111fa049b18"). InnerVolumeSpecName "kube-api-access-8484s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.764795 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8484s\" (UniqueName: \"kubernetes.io/projected/42c9a852-0a4d-4134-9646-5111fa049b18-kube-api-access-8484s\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.820326 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-config" (OuterVolumeSpecName: "config") pod "42c9a852-0a4d-4134-9646-5111fa049b18" (UID: "42c9a852-0a4d-4134-9646-5111fa049b18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.827526 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42c9a852-0a4d-4134-9646-5111fa049b18" (UID: "42c9a852-0a4d-4134-9646-5111fa049b18"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.866490 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:08 crc kubenswrapper[4782]: I0130 18:47:08.866538 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42c9a852-0a4d-4134-9646-5111fa049b18-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.031516 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk" event={"ID":"91d457a1-1878-47f1-a1d3-eac450864978","Type":"ContainerStarted","Data":"c2f2fcb5a1b231b619fad9f8bb12cf3a065d2b1dc76d7a941fa68ee41ea16132"} Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.032809 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-pz2pk" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.034749 4782 generic.go:334] "Generic (PLEG): container finished" podID="42c9a852-0a4d-4134-9646-5111fa049b18" containerID="513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e" exitCode=0 Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.034778 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" event={"ID":"42c9a852-0a4d-4134-9646-5111fa049b18","Type":"ContainerDied","Data":"513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e"} Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.034809 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" event={"ID":"42c9a852-0a4d-4134-9646-5111fa049b18","Type":"ContainerDied","Data":"64de1e0a1159fa313acd192aa89d58e59c2d02711fee2ff82a9a63983f5de2ac"} Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.034831 4782 scope.go:117] "RemoveContainer" containerID="513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.035001 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d8689c4df-kggm7" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.041033 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a5e12229-4958-47a9-9210-18fba05c1319","Type":"ContainerStarted","Data":"c583862e12bc9d689d7fbfee1cbb2bebfd1abf7ef4463cc3be84154346f7fd0d"} Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.046746 4782 generic.go:334] "Generic (PLEG): container finished" podID="f3433e7d-6a6b-4f6b-b061-22479d5391f9" containerID="e84b9c9fb02170a71857314d4cf9fe98b30e736fd7894ae8790198ab8ec70e23" exitCode=0 Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.046858 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bk7c8" event={"ID":"f3433e7d-6a6b-4f6b-b061-22479d5391f9","Type":"ContainerDied","Data":"e84b9c9fb02170a71857314d4cf9fe98b30e736fd7894ae8790198ab8ec70e23"} Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.054860 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6123a5a8-5a6d-455c-9418-71d31b35e2f3","Type":"ContainerStarted","Data":"f97ef2cd588b55db2036022703b4ce978778c9c5be85841fd633bb924806b8d0"} Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.055756 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-pz2pk" podStartSLOduration=14.206524746 podStartE2EDuration="24.055743611s" podCreationTimestamp="2026-01-30 18:46:45 +0000 UTC" firstStartedPulling="2026-01-30 18:46:57.38140343 +0000 UTC m=+993.649781455" lastFinishedPulling="2026-01-30 18:47:07.230622255 +0000 UTC m=+1003.499000320" observedRunningTime="2026-01-30 18:47:09.055600258 +0000 UTC m=+1005.323978283" watchObservedRunningTime="2026-01-30 18:47:09.055743611 +0000 UTC m=+1005.324121636" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.060080 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a89d76b9-7010-4d8b-ac8e-fac56394928d","Type":"ContainerStarted","Data":"d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7"} Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.079756 4782 scope.go:117] "RemoveContainer" containerID="5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.099164 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.16327353 podStartE2EDuration="27.099143906s" podCreationTimestamp="2026-01-30 18:46:42 +0000 UTC" firstStartedPulling="2026-01-30 18:46:57.228900422 +0000 UTC m=+993.497278447" lastFinishedPulling="2026-01-30 18:47:08.164770798 +0000 UTC m=+1004.433148823" observedRunningTime="2026-01-30 18:47:09.096549242 +0000 UTC m=+1005.364927267" watchObservedRunningTime="2026-01-30 18:47:09.099143906 +0000 UTC m=+1005.367521931" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.128172 4782 scope.go:117] "RemoveContainer" containerID="513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.129006 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d8689c4df-kggm7"] Jan 30 18:47:09 crc kubenswrapper[4782]: E0130 18:47:09.129910 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e\": container with ID starting with 513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e not found: ID does not exist" containerID="513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.129962 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e"} err="failed to get container status \"513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e\": rpc error: code = NotFound desc = could not find container \"513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e\": container with ID starting with 513c7cc9c333da217625a490ce1f5dbb7914d9c2c575ce605892b5d275e3ca6e not found: ID does not exist" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.129994 4782 scope.go:117] "RemoveContainer" containerID="5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be" Jan 30 18:47:09 crc kubenswrapper[4782]: E0130 18:47:09.130337 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be\": container with ID starting with 5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be not found: ID does not exist" containerID="5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.130372 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be"} err="failed to get container status \"5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be\": rpc error: code = NotFound desc = could not find container \"5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be\": container with ID starting with 5333b9ebeeb7ea70af677c97847dc73481137c13261c95c777cc971aa96ef5be not found: ID does not exist" Jan 30 18:47:09 crc kubenswrapper[4782]: I0130 18:47:09.140889 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d8689c4df-kggm7"] Jan 30 18:47:10 crc kubenswrapper[4782]: I0130 18:47:10.076515 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bk7c8" event={"ID":"f3433e7d-6a6b-4f6b-b061-22479d5391f9","Type":"ContainerStarted","Data":"b07158ef21f12b6fc2cb5c370cd250150a50f32c7b9477a20ed194816ba5d304"} Jan 30 18:47:10 crc kubenswrapper[4782]: I0130 18:47:10.077666 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 18:47:10 crc kubenswrapper[4782]: I0130 18:47:10.422929 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c9a852-0a4d-4134-9646-5111fa049b18" path="/var/lib/kubelet/pods/42c9a852-0a4d-4134-9646-5111fa049b18/volumes" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.089480 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a5e12229-4958-47a9-9210-18fba05c1319","Type":"ContainerStarted","Data":"2ce5a4a333222463bf583836a1b72098f31be0285194e41fabe8329ab41603a0"} Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.099585 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bk7c8" 
event={"ID":"f3433e7d-6a6b-4f6b-b061-22479d5391f9","Type":"ContainerStarted","Data":"408b4d8f664be32646d35720adb9aa1df7ec575f8ebca5835a8991a8311fffa8"} Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.099792 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.102180 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6123a5a8-5a6d-455c-9418-71d31b35e2f3","Type":"ContainerStarted","Data":"b6e2ac39354d7b8d5048102c651135245279803b3e967ff2074a5cfa70ce218a"} Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.104099 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerStarted","Data":"e8e79982a7d959454bc43b52f926e3a06b057e89684c1a809e35d6e8fa448449"} Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.124906 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.588986197 podStartE2EDuration="26.124882812s" podCreationTimestamp="2026-01-30 18:46:45 +0000 UTC" firstStartedPulling="2026-01-30 18:46:57.61227405 +0000 UTC m=+993.880652095" lastFinishedPulling="2026-01-30 18:47:10.148170675 +0000 UTC m=+1006.416548710" observedRunningTime="2026-01-30 18:47:11.114919065 +0000 UTC m=+1007.383297120" watchObservedRunningTime="2026-01-30 18:47:11.124882812 +0000 UTC m=+1007.393260877" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.157063 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.388246903 podStartE2EDuration="23.157040078s" podCreationTimestamp="2026-01-30 18:46:48 +0000 UTC" firstStartedPulling="2026-01-30 18:46:57.376754265 +0000 UTC m=+993.645132300" lastFinishedPulling="2026-01-30 18:47:10.14554744 +0000 UTC m=+1006.413925475" observedRunningTime="2026-01-30 18:47:11.148125508 +0000 UTC m=+1007.416503563" watchObservedRunningTime="2026-01-30 18:47:11.157040078 +0000 UTC m=+1007.425418143" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.160672 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.209740 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bk7c8" podStartSLOduration=18.399146366 podStartE2EDuration="26.209720014s" podCreationTimestamp="2026-01-30 18:46:45 +0000 UTC" firstStartedPulling="2026-01-30 18:46:59.259780536 +0000 UTC m=+995.528158571" lastFinishedPulling="2026-01-30 18:47:07.070354184 +0000 UTC m=+1003.338732219" observedRunningTime="2026-01-30 18:47:11.202047283 +0000 UTC m=+1007.470425348" watchObservedRunningTime="2026-01-30 18:47:11.209720014 +0000 UTC m=+1007.478098039" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.215659 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.344373 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.683730 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.847902 4782 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hcgcc"] Jan 30 18:47:11 crc kubenswrapper[4782]: E0130 18:47:11.848317 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c9a852-0a4d-4134-9646-5111fa049b18" containerName="init" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.848331 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c9a852-0a4d-4134-9646-5111fa049b18" containerName="init" Jan 30 18:47:11 crc kubenswrapper[4782]: E0130 18:47:11.848340 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c9a852-0a4d-4134-9646-5111fa049b18" containerName="dnsmasq-dns" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.848346 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c9a852-0a4d-4134-9646-5111fa049b18" containerName="dnsmasq-dns" Jan 30 18:47:11 crc kubenswrapper[4782]: E0130 18:47:11.848376 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf13a98-47a3-4b6b-9448-16da2d5a1b08" containerName="init" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.848383 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf13a98-47a3-4b6b-9448-16da2d5a1b08" containerName="init" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.848538 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c9a852-0a4d-4134-9646-5111fa049b18" containerName="dnsmasq-dns" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.848554 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cf13a98-47a3-4b6b-9448-16da2d5a1b08" containerName="init" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.849447 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.853767 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.864290 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hcgcc"] Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.915906 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7f80e9-b13c-461c-b115-55b8ce9662dc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.915975 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ed7f80e9-b13c-461c-b115-55b8ce9662dc-ovn-rundir\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.916046 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7f80e9-b13c-461c-b115-55b8ce9662dc-combined-ca-bundle\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.916064 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t7777\" (UniqueName: \"kubernetes.io/projected/ed7f80e9-b13c-461c-b115-55b8ce9662dc-kube-api-access-t7777\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.916143 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ed7f80e9-b13c-461c-b115-55b8ce9662dc-ovs-rundir\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.916184 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7f80e9-b13c-461c-b115-55b8ce9662dc-config\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.990975 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-576d7447bf-4hg9v"] Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.992940 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:11 crc kubenswrapper[4782]: I0130 18:47:11.997184 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-576d7447bf-4hg9v"] Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.012725 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018128 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7f80e9-b13c-461c-b115-55b8ce9662dc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018169 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ed7f80e9-b13c-461c-b115-55b8ce9662dc-ovn-rundir\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018219 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7777\" (UniqueName: \"kubernetes.io/projected/ed7f80e9-b13c-461c-b115-55b8ce9662dc-kube-api-access-t7777\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018275 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7f80e9-b13c-461c-b115-55b8ce9662dc-combined-ca-bundle\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018326 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ed7f80e9-b13c-461c-b115-55b8ce9662dc-ovs-rundir\") 
pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018353 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7f80e9-b13c-461c-b115-55b8ce9662dc-config\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018459 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ed7f80e9-b13c-461c-b115-55b8ce9662dc-ovn-rundir\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.018697 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ed7f80e9-b13c-461c-b115-55b8ce9662dc-ovs-rundir\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.019347 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7f80e9-b13c-461c-b115-55b8ce9662dc-config\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.026936 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed7f80e9-b13c-461c-b115-55b8ce9662dc-combined-ca-bundle\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.056468 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed7f80e9-b13c-461c-b115-55b8ce9662dc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.062078 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7777\" (UniqueName: \"kubernetes.io/projected/ed7f80e9-b13c-461c-b115-55b8ce9662dc-kube-api-access-t7777\") pod \"ovn-controller-metrics-hcgcc\" (UID: \"ed7f80e9-b13c-461c-b115-55b8ce9662dc\") " pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.113958 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.120255 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsdcr\" (UniqueName: \"kubernetes.io/projected/af953862-5a6d-417e-a7cb-7b5ef8283ec1-kube-api-access-tsdcr\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.120300 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-config\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.120341 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-dns-svc\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.120597 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-ovsdbserver-nb\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.134262 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-576d7447bf-4hg9v"] Jan 30 18:47:12 crc kubenswrapper[4782]: E0130 18:47:12.134936 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-tsdcr ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" podUID="af953862-5a6d-417e-a7cb-7b5ef8283ec1" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.161576 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56f5b65c6f-ms4px"] Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.162950 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.165161 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.172481 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f5b65c6f-ms4px"] Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.180540 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hcgcc" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.222340 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-nb\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.222397 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-ovsdbserver-nb\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.222535 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-config\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.222828 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsdcr\" (UniqueName: \"kubernetes.io/projected/af953862-5a6d-417e-a7cb-7b5ef8283ec1-kube-api-access-tsdcr\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.222893 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-config\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.222911 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmnw\" (UniqueName: \"kubernetes.io/projected/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-kube-api-access-pwmnw\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.223067 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-dns-svc\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.223151 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-dns-svc\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.223215 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-sb\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: 
\"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.224640 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-ovsdbserver-nb\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.224934 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-dns-svc\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.225046 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-config\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.250581 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsdcr\" (UniqueName: \"kubernetes.io/projected/af953862-5a6d-417e-a7cb-7b5ef8283ec1-kube-api-access-tsdcr\") pod \"dnsmasq-dns-576d7447bf-4hg9v\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.327270 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-dns-svc\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.327353 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-sb\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.327443 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-nb\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.327530 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-config\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.327590 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmnw\" (UniqueName: \"kubernetes.io/projected/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-kube-api-access-pwmnw\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc 
kubenswrapper[4782]: I0130 18:47:12.328520 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-sb\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.329182 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-config\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.330646 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-nb\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.328747 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-dns-svc\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.348130 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmnw\" (UniqueName: \"kubernetes.io/projected/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-kube-api-access-pwmnw\") pod \"dnsmasq-dns-56f5b65c6f-ms4px\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.490031 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.679159 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hcgcc"] Jan 30 18:47:12 crc kubenswrapper[4782]: W0130 18:47:12.965203 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a28a7e1_572a_488d_a38f_3fe4dacf6b4f.slice/crio-6db36d287ce44ef8e468d69b9d2df5aa08f04740847ff02590055e100bae3fb0 WatchSource:0}: Error finding container 6db36d287ce44ef8e468d69b9d2df5aa08f04740847ff02590055e100bae3fb0: Status 404 returned error can't find the container with id 6db36d287ce44ef8e468d69b9d2df5aa08f04740847ff02590055e100bae3fb0 Jan 30 18:47:12 crc kubenswrapper[4782]: I0130 18:47:12.967336 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56f5b65c6f-ms4px"] Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.045833 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.126623 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" event={"ID":"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f","Type":"ContainerStarted","Data":"6db36d287ce44ef8e468d69b9d2df5aa08f04740847ff02590055e100bae3fb0"} Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.128129 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.128391 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hcgcc" event={"ID":"ed7f80e9-b13c-461c-b115-55b8ce9662dc","Type":"ContainerStarted","Data":"8f62dbb287b5d795d39655fcd7ba2a323358e0e5fca14084d0a5db8ccee1ac6c"} Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.128464 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hcgcc" event={"ID":"ed7f80e9-b13c-461c-b115-55b8ce9662dc","Type":"ContainerStarted","Data":"4c00d290f2fb29aae775b46ebb3eb68d70837c078c979f5848c433d98ced9d7f"} Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.145472 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.191567 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.209655 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hcgcc" podStartSLOduration=2.20963488 podStartE2EDuration="2.20963488s" podCreationTimestamp="2026-01-30 18:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:13.158385661 +0000 UTC m=+1009.426763696" watchObservedRunningTime="2026-01-30 18:47:13.20963488 +0000 UTC m=+1009.478012905" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.243497 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-config\") pod \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.243604 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-ovsdbserver-nb\") pod \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.243887 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsdcr\" (UniqueName: \"kubernetes.io/projected/af953862-5a6d-417e-a7cb-7b5ef8283ec1-kube-api-access-tsdcr\") pod \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.243916 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-dns-svc\") pod \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\" (UID: \"af953862-5a6d-417e-a7cb-7b5ef8283ec1\") " Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.244976 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af953862-5a6d-417e-a7cb-7b5ef8283ec1" (UID: "af953862-5a6d-417e-a7cb-7b5ef8283ec1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.245402 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-config" (OuterVolumeSpecName: "config") pod "af953862-5a6d-417e-a7cb-7b5ef8283ec1" (UID: "af953862-5a6d-417e-a7cb-7b5ef8283ec1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.246280 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af953862-5a6d-417e-a7cb-7b5ef8283ec1" (UID: "af953862-5a6d-417e-a7cb-7b5ef8283ec1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.249380 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af953862-5a6d-417e-a7cb-7b5ef8283ec1-kube-api-access-tsdcr" (OuterVolumeSpecName: "kube-api-access-tsdcr") pod "af953862-5a6d-417e-a7cb-7b5ef8283ec1" (UID: "af953862-5a6d-417e-a7cb-7b5ef8283ec1"). InnerVolumeSpecName "kube-api-access-tsdcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.346143 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsdcr\" (UniqueName: \"kubernetes.io/projected/af953862-5a6d-417e-a7cb-7b5ef8283ec1-kube-api-access-tsdcr\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.346178 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.346189 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.346199 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af953862-5a6d-417e-a7cb-7b5ef8283ec1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.683722 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 30 18:47:13 crc kubenswrapper[4782]: I0130 18:47:13.738523 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.137637 4782 generic.go:334] "Generic (PLEG): container finished" podID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerID="826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06" exitCode=0 Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.137685 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" event={"ID":"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f","Type":"ContainerDied","Data":"826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06"} Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.137715 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-576d7447bf-4hg9v" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.214631 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.238992 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-576d7447bf-4hg9v"] Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.247479 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-576d7447bf-4hg9v"] Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.369082 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.370794 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.374922 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-zgxg2" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.375135 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.375316 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.375469 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.389681 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.431332 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af953862-5a6d-417e-a7cb-7b5ef8283ec1" path="/var/lib/kubelet/pods/af953862-5a6d-417e-a7cb-7b5ef8283ec1/volumes" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.471888 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.472112 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.472130 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.472185 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/547e7f64-963a-48bd-afa5-e908a3a716a2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.472218 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf8br\" (UniqueName: \"kubernetes.io/projected/547e7f64-963a-48bd-afa5-e908a3a716a2-kube-api-access-wf8br\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.472280 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/547e7f64-963a-48bd-afa5-e908a3a716a2-config\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.472307 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547e7f64-963a-48bd-afa5-e908a3a716a2-scripts\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.573763 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/547e7f64-963a-48bd-afa5-e908a3a716a2-config\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.573827 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547e7f64-963a-48bd-afa5-e908a3a716a2-scripts\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.573873 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.573897 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.573915 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.573966 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/547e7f64-963a-48bd-afa5-e908a3a716a2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.573982 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf8br\" (UniqueName: \"kubernetes.io/projected/547e7f64-963a-48bd-afa5-e908a3a716a2-kube-api-access-wf8br\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 
18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.609619 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/547e7f64-963a-48bd-afa5-e908a3a716a2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.609771 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/547e7f64-963a-48bd-afa5-e908a3a716a2-config\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.610139 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/547e7f64-963a-48bd-afa5-e908a3a716a2-scripts\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.610415 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf8br\" (UniqueName: \"kubernetes.io/projected/547e7f64-963a-48bd-afa5-e908a3a716a2-kube-api-access-wf8br\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.610598 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.610747 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.610929 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/547e7f64-963a-48bd-afa5-e908a3a716a2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"547e7f64-963a-48bd-afa5-e908a3a716a2\") " pod="openstack/ovn-northd-0" Jan 30 18:47:14 crc kubenswrapper[4782]: I0130 18:47:14.695397 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 18:47:15 crc kubenswrapper[4782]: I0130 18:47:15.136332 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 18:47:15 crc kubenswrapper[4782]: W0130 18:47:15.148472 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod547e7f64_963a_48bd_afa5_e908a3a716a2.slice/crio-005fa8a8cfa844e3788bb5a6ce34374c53b48244a9f655a20887948c3e5a7c14 WatchSource:0}: Error finding container 005fa8a8cfa844e3788bb5a6ce34374c53b48244a9f655a20887948c3e5a7c14: Status 404 returned error can't find the container with id 005fa8a8cfa844e3788bb5a6ce34374c53b48244a9f655a20887948c3e5a7c14 Jan 30 18:47:15 crc kubenswrapper[4782]: I0130 18:47:15.150498 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" event={"ID":"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f","Type":"ContainerStarted","Data":"aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d"} Jan 30 18:47:15 crc kubenswrapper[4782]: I0130 18:47:15.178763 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" podStartSLOduration=3.178741733 podStartE2EDuration="3.178741733s" podCreationTimestamp="2026-01-30 18:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:15.17256313 +0000 UTC m=+1011.440941165" watchObservedRunningTime="2026-01-30 18:47:15.178741733 +0000 UTC m=+1011.447119798" Jan 30 18:47:15 crc kubenswrapper[4782]: I0130 18:47:15.817989 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.163683 4782 generic.go:334] "Generic (PLEG): container finished" podID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerID="e8e79982a7d959454bc43b52f926e3a06b057e89684c1a809e35d6e8fa448449" exitCode=0 Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.163740 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerDied","Data":"e8e79982a7d959454bc43b52f926e3a06b057e89684c1a809e35d6e8fa448449"} Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.167806 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"547e7f64-963a-48bd-afa5-e908a3a716a2","Type":"ContainerStarted","Data":"f12fe37bd186dc6aac1882fd472b852a04f132160f105a1bde60042fc2259127"} Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.167848 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"547e7f64-963a-48bd-afa5-e908a3a716a2","Type":"ContainerStarted","Data":"fb3c2992aee51477789b1a51977f62d07f7dd333191cc8b84a756731208fdef8"} Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.167864 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"547e7f64-963a-48bd-afa5-e908a3a716a2","Type":"ContainerStarted","Data":"005fa8a8cfa844e3788bb5a6ce34374c53b48244a9f655a20887948c3e5a7c14"} Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.168139 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.168410 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovn-northd-0" Jan 30 18:47:16 crc kubenswrapper[4782]: I0130 18:47:16.231601 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.613416411 podStartE2EDuration="2.231583866s" podCreationTimestamp="2026-01-30 18:47:14 +0000 UTC" firstStartedPulling="2026-01-30 18:47:15.159503036 +0000 UTC m=+1011.427881081" lastFinishedPulling="2026-01-30 18:47:15.777670511 +0000 UTC m=+1012.046048536" observedRunningTime="2026-01-30 18:47:16.229957236 +0000 UTC m=+1012.498335271" watchObservedRunningTime="2026-01-30 18:47:16.231583866 +0000 UTC m=+1012.499961901" Jan 30 18:47:18 crc kubenswrapper[4782]: I0130 18:47:18.191705 4782 generic.go:334] "Generic (PLEG): container finished" podID="a458f19f-501f-4703-9cfe-d8638418215b" containerID="c3f7ad4e916efca0747754b148003da2deaf25fa37fba2c4f4b5187fcecfedc3" exitCode=0 Jan 30 18:47:18 crc kubenswrapper[4782]: I0130 18:47:18.191818 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a458f19f-501f-4703-9cfe-d8638418215b","Type":"ContainerDied","Data":"c3f7ad4e916efca0747754b148003da2deaf25fa37fba2c4f4b5187fcecfedc3"} Jan 30 18:47:18 crc kubenswrapper[4782]: I0130 18:47:18.198096 4782 generic.go:334] "Generic (PLEG): container finished" podID="070c9056-8c32-47ae-b937-b3e4b2b464e7" containerID="c0318ed286c0db67dd7341687540d6b557d6b7e714e19628d84e2c715a6033a6" exitCode=0 Jan 30 18:47:18 crc kubenswrapper[4782]: I0130 18:47:18.198124 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"070c9056-8c32-47ae-b937-b3e4b2b464e7","Type":"ContainerDied","Data":"c0318ed286c0db67dd7341687540d6b557d6b7e714e19628d84e2c715a6033a6"} Jan 30 18:47:19 crc kubenswrapper[4782]: I0130 18:47:19.210174 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a458f19f-501f-4703-9cfe-d8638418215b","Type":"ContainerStarted","Data":"2b488ef06c851746902b0766af359514197d4d2ad70114161b3ec7b896671326"} Jan 30 18:47:19 crc kubenswrapper[4782]: I0130 18:47:19.212564 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"070c9056-8c32-47ae-b937-b3e4b2b464e7","Type":"ContainerStarted","Data":"390586523bc4a1bd81da7683be042b77f3fbcce45951936572dc8311c5503bd1"} Jan 30 18:47:19 crc kubenswrapper[4782]: I0130 18:47:19.238574 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.328155509 podStartE2EDuration="40.23855547s" podCreationTimestamp="2026-01-30 18:46:39 +0000 UTC" firstStartedPulling="2026-01-30 18:46:56.635811278 +0000 UTC m=+992.904189333" lastFinishedPulling="2026-01-30 18:47:01.546211269 +0000 UTC m=+997.814589294" observedRunningTime="2026-01-30 18:47:19.23731406 +0000 UTC m=+1015.505692115" watchObservedRunningTime="2026-01-30 18:47:19.23855547 +0000 UTC m=+1015.506933505" Jan 30 18:47:19 crc kubenswrapper[4782]: I0130 18:47:19.265076 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=37.503765068999996 podStartE2EDuration="42.265058197s" podCreationTimestamp="2026-01-30 18:46:37 +0000 UTC" firstStartedPulling="2026-01-30 18:46:56.631538862 +0000 UTC m=+992.899916917" lastFinishedPulling="2026-01-30 18:47:01.39283202 +0000 UTC m=+997.661210045" observedRunningTime="2026-01-30 18:47:19.264318349 +0000 UTC m=+1015.532696384" 
watchObservedRunningTime="2026-01-30 18:47:19.265058197 +0000 UTC m=+1015.533436222" Jan 30 18:47:19 crc kubenswrapper[4782]: I0130 18:47:19.793192 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:47:19 crc kubenswrapper[4782]: I0130 18:47:19.793403 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:47:20 crc kubenswrapper[4782]: E0130 18:47:20.367642 4782 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.212:36984->38.102.83.212:36463: write tcp 38.102.83.212:36984->38.102.83.212:36463: write: broken pipe Jan 30 18:47:20 crc kubenswrapper[4782]: I0130 18:47:20.466775 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 30 18:47:20 crc kubenswrapper[4782]: I0130 18:47:20.466818 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 30 18:47:22 crc kubenswrapper[4782]: I0130 18:47:22.491430 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:22 crc kubenswrapper[4782]: I0130 18:47:22.541988 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8666d45c85-h72w2"] Jan 30 18:47:22 crc kubenswrapper[4782]: I0130 18:47:22.542265 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="dnsmasq-dns" containerID="cri-o://7be8a0f509b2b1fd773c6b227fa70a90f67a96dd0e8af262aa73cb429c59845d" gracePeriod=10 Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.230591 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84ddc495b5-zgrm6"] Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.240257 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.246372 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84ddc495b5-zgrm6"] Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.262384 4782 generic.go:334] "Generic (PLEG): container finished" podID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerID="7be8a0f509b2b1fd773c6b227fa70a90f67a96dd0e8af262aa73cb429c59845d" exitCode=0 Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.262420 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" event={"ID":"bbc79520-07bf-4876-a889-9bcd4b3be4c7","Type":"ContainerDied","Data":"7be8a0f509b2b1fd773c6b227fa70a90f67a96dd0e8af262aa73cb429c59845d"} Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.355855 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-dns-svc\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.355904 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-config\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.355952 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-nb\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.356029 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-sb\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.356374 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r4f5\" (UniqueName: \"kubernetes.io/projected/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-kube-api-access-8r4f5\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.457789 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r4f5\" (UniqueName: \"kubernetes.io/projected/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-kube-api-access-8r4f5\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.457868 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-dns-svc\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: 
\"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.457888 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-config\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.457955 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-nb\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.458794 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-sb\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.458725 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-dns-svc\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.458739 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-config\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.459203 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-nb\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.459399 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-sb\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.477281 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r4f5\" (UniqueName: \"kubernetes.io/projected/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-kube-api-access-8r4f5\") pod \"dnsmasq-dns-84ddc495b5-zgrm6\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:23 crc kubenswrapper[4782]: I0130 18:47:23.563278 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.390254 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.397480 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.400916 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.400913 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.401001 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.401031 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-7zdpv" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.461114 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.580325 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.580993 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ef1358-db2b-4935-b53c-7aad2613cee7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.581170 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llvw9\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-kube-api-access-llvw9\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.581275 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15ef1358-db2b-4935-b53c-7aad2613cee7-lock\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.581440 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15ef1358-db2b-4935-b53c-7aad2613cee7-cache\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.581516 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.682816 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-llvw9\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-kube-api-access-llvw9\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.682872 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15ef1358-db2b-4935-b53c-7aad2613cee7-lock\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.682903 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15ef1358-db2b-4935-b53c-7aad2613cee7-cache\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.682924 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.682993 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.683014 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ef1358-db2b-4935-b53c-7aad2613cee7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.683361 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.683383 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15ef1358-db2b-4935-b53c-7aad2613cee7-cache\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.683633 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15ef1358-db2b-4935-b53c-7aad2613cee7-lock\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.688585 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.691607 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ef1358-db2b-4935-b53c-7aad2613cee7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.694770 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.705187 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.707848 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: I0130 18:47:24.708628 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llvw9\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-kube-api-access-llvw9\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:24 crc kubenswrapper[4782]: E0130 18:47:24.714653 4782 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 18:47:24 crc kubenswrapper[4782]: E0130 18:47:24.714684 4782 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 18:47:24 crc kubenswrapper[4782]: E0130 18:47:24.714741 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift podName:15ef1358-db2b-4935-b53c-7aad2613cee7 nodeName:}" failed. No retries permitted until 2026-01-30 18:47:25.214720177 +0000 UTC m=+1021.483098192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift") pod "swift-storage-0" (UID: "15ef1358-db2b-4935-b53c-7aad2613cee7") : configmap "swift-ring-files" not found Jan 30 18:47:25 crc kubenswrapper[4782]: I0130 18:47:25.293668 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:25 crc kubenswrapper[4782]: E0130 18:47:25.293854 4782 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 18:47:25 crc kubenswrapper[4782]: E0130 18:47:25.293877 4782 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 18:47:25 crc kubenswrapper[4782]: E0130 18:47:25.293937 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift podName:15ef1358-db2b-4935-b53c-7aad2613cee7 nodeName:}" failed. No retries permitted until 2026-01-30 18:47:26.293918396 +0000 UTC m=+1022.562296421 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift") pod "swift-storage-0" (UID: "15ef1358-db2b-4935-b53c-7aad2613cee7") : configmap "swift-ring-files" not found Jan 30 18:47:26 crc kubenswrapper[4782]: I0130 18:47:26.310033 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:26 crc kubenswrapper[4782]: E0130 18:47:26.310435 4782 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 18:47:26 crc kubenswrapper[4782]: E0130 18:47:26.310461 4782 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 18:47:26 crc kubenswrapper[4782]: E0130 18:47:26.310545 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift podName:15ef1358-db2b-4935-b53c-7aad2613cee7 nodeName:}" failed. No retries permitted until 2026-01-30 18:47:28.310514272 +0000 UTC m=+1024.578892307 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift") pod "swift-storage-0" (UID: "15ef1358-db2b-4935-b53c-7aad2613cee7") : configmap "swift-ring-files" not found Jan 30 18:47:26 crc kubenswrapper[4782]: I0130 18:47:26.464859 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: connect: connection refused" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.317926 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mbdqv"] Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.320059 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.322972 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.323003 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.323099 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.328998 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mbdqv"] Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.344954 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:28 crc kubenswrapper[4782]: E0130 18:47:28.345328 4782 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 18:47:28 crc kubenswrapper[4782]: E0130 18:47:28.345365 4782 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 18:47:28 crc kubenswrapper[4782]: E0130 18:47:28.345444 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift podName:15ef1358-db2b-4935-b53c-7aad2613cee7 nodeName:}" failed. No retries permitted until 2026-01-30 18:47:32.345416784 +0000 UTC m=+1028.613794829 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift") pod "swift-storage-0" (UID: "15ef1358-db2b-4935-b53c-7aad2613cee7") : configmap "swift-ring-files" not found Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.446597 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-combined-ca-bundle\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.446671 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-swiftconf\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.446775 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmfl\" (UniqueName: \"kubernetes.io/projected/8a9f5c0e-8d43-437d-b47e-e72f03df077b-kube-api-access-flmfl\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.446932 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9f5c0e-8d43-437d-b47e-e72f03df077b-etc-swift\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.447095 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-dispersionconf\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.447150 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-scripts\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.447192 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-ring-data-devices\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.548787 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-combined-ca-bundle\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.548868 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-swiftconf\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.548918 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flmfl\" (UniqueName: \"kubernetes.io/projected/8a9f5c0e-8d43-437d-b47e-e72f03df077b-kube-api-access-flmfl\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.548982 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9f5c0e-8d43-437d-b47e-e72f03df077b-etc-swift\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.549080 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-dispersionconf\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.549112 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-scripts\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.549136 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-ring-data-devices\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.549979 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9f5c0e-8d43-437d-b47e-e72f03df077b-etc-swift\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.550033 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-ring-data-devices\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.550874 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-scripts\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.557790 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-swiftconf\") pod 
\"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.557894 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-dispersionconf\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.565024 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-combined-ca-bundle\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.570507 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmfl\" (UniqueName: \"kubernetes.io/projected/8a9f5c0e-8d43-437d-b47e-e72f03df077b-kube-api-access-flmfl\") pod \"swift-ring-rebalance-mbdqv\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.643643 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-7zdpv" Jan 30 18:47:28 crc kubenswrapper[4782]: I0130 18:47:28.652320 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:29 crc kubenswrapper[4782]: I0130 18:47:29.056686 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 30 18:47:29 crc kubenswrapper[4782]: I0130 18:47:29.056758 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 30 18:47:29 crc kubenswrapper[4782]: I0130 18:47:29.644022 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 30 18:47:29 crc kubenswrapper[4782]: I0130 18:47:29.788959 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="070c9056-8c32-47ae-b937-b3e4b2b464e7" containerName="galera" probeResult="failure" output=< Jan 30 18:47:29 crc kubenswrapper[4782]: wsrep_local_state_comment (Joined) differs from Synced Jan 30 18:47:29 crc kubenswrapper[4782]: > Jan 30 18:47:30 crc kubenswrapper[4782]: E0130 18:47:30.310150 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e3b2844_afde_444d_b7ee_cddd8b543bf6.slice/crio-e5cddf4c75c6fb67ec32f3597e9291de31dcd4a1c3cc5f5854d34e84b38917bf.scope\": RecentStats: unable to find data in memory cache]" Jan 30 18:47:30 crc kubenswrapper[4782]: I0130 18:47:30.327954 4782 generic.go:334] "Generic (PLEG): container finished" podID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerID="4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521" exitCode=0 Jan 30 18:47:30 crc kubenswrapper[4782]: I0130 18:47:30.328012 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30ded24a-ee08-4d96-80f1-3d5793ec76bb","Type":"ContainerDied","Data":"4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521"} Jan 30 18:47:30 crc kubenswrapper[4782]: I0130 
18:47:30.337984 4782 generic.go:334] "Generic (PLEG): container finished" podID="9e3b2844-afde-444d-b7ee-cddd8b543bf6" containerID="e5cddf4c75c6fb67ec32f3597e9291de31dcd4a1c3cc5f5854d34e84b38917bf" exitCode=0 Jan 30 18:47:30 crc kubenswrapper[4782]: I0130 18:47:30.338043 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9e3b2844-afde-444d-b7ee-cddd8b543bf6","Type":"ContainerDied","Data":"e5cddf4c75c6fb67ec32f3597e9291de31dcd4a1c3cc5f5854d34e84b38917bf"} Jan 30 18:47:31 crc kubenswrapper[4782]: I0130 18:47:31.350310 4782 generic.go:334] "Generic (PLEG): container finished" podID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerID="f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18" exitCode=0 Jan 30 18:47:31 crc kubenswrapper[4782]: I0130 18:47:31.350451 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ef56512-bf17-45df-9e3d-ff2e97f66252","Type":"ContainerDied","Data":"f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18"} Jan 30 18:47:31 crc kubenswrapper[4782]: I0130 18:47:31.464633 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: connect: connection refused" Jan 30 18:47:32 crc kubenswrapper[4782]: I0130 18:47:32.414844 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:32 crc kubenswrapper[4782]: E0130 18:47:32.415071 4782 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 18:47:32 crc kubenswrapper[4782]: E0130 18:47:32.415088 4782 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 18:47:32 crc kubenswrapper[4782]: E0130 18:47:32.415131 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift podName:15ef1358-db2b-4935-b53c-7aad2613cee7 nodeName:}" failed. No retries permitted until 2026-01-30 18:47:40.415114698 +0000 UTC m=+1036.683492723 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift") pod "swift-storage-0" (UID: "15ef1358-db2b-4935-b53c-7aad2613cee7") : configmap "swift-ring-files" not found Jan 30 18:47:33 crc kubenswrapper[4782]: E0130 18:47:33.526479 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741" Jan 30 18:47:33 crc kubenswrapper[4782]: E0130 18:47:33.526823 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.enable-remote-write-receiver --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus --web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:true,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cdmb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 
web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 18:47:33 crc kubenswrapper[4782]: I0130 18:47:33.607008 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 30 18:47:33 crc kubenswrapper[4782]: I0130 18:47:33.805359 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 30 18:47:33 crc kubenswrapper[4782]: I0130 18:47:33.885814 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:47:33 crc kubenswrapper[4782]: I0130 18:47:33.962630 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkc2w\" (UniqueName: \"kubernetes.io/projected/bbc79520-07bf-4876-a889-9bcd4b3be4c7-kube-api-access-jkc2w\") pod \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " Jan 30 18:47:33 crc kubenswrapper[4782]: I0130 18:47:33.962713 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-dns-svc\") pod \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " Jan 30 18:47:33 crc kubenswrapper[4782]: I0130 18:47:33.962751 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-config\") pod \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\" (UID: \"bbc79520-07bf-4876-a889-9bcd4b3be4c7\") " Jan 30 18:47:33 crc kubenswrapper[4782]: I0130 18:47:33.975719 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc79520-07bf-4876-a889-9bcd4b3be4c7-kube-api-access-jkc2w" (OuterVolumeSpecName: "kube-api-access-jkc2w") pod "bbc79520-07bf-4876-a889-9bcd4b3be4c7" (UID: "bbc79520-07bf-4876-a889-9bcd4b3be4c7"). InnerVolumeSpecName "kube-api-access-jkc2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.014844 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbc79520-07bf-4876-a889-9bcd4b3be4c7" (UID: "bbc79520-07bf-4876-a889-9bcd4b3be4c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.014855 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-config" (OuterVolumeSpecName: "config") pod "bbc79520-07bf-4876-a889-9bcd4b3be4c7" (UID: "bbc79520-07bf-4876-a889-9bcd4b3be4c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.043567 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mbdqv"] Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.065057 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkc2w\" (UniqueName: \"kubernetes.io/projected/bbc79520-07bf-4876-a889-9bcd4b3be4c7-kube-api-access-jkc2w\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.065088 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.065097 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbc79520-07bf-4876-a889-9bcd4b3be4c7-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.176961 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84ddc495b5-zgrm6"] Jan 30 18:47:34 crc kubenswrapper[4782]: W0130 18:47:34.183041 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87763cd9_7b99_4b9a_8e0f_02ea849a6b56.slice/crio-69de1d0db2a5cd41e7321dcadb6d495c25ab9bfc48f453ac6d03e314608d82f9 WatchSource:0}: Error finding container 69de1d0db2a5cd41e7321dcadb6d495c25ab9bfc48f453ac6d03e314608d82f9: Status 404 returned error can't find the container with id 69de1d0db2a5cd41e7321dcadb6d495c25ab9bfc48f453ac6d03e314608d82f9 Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.379352 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ef56512-bf17-45df-9e3d-ff2e97f66252","Type":"ContainerStarted","Data":"598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f"} Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.379591 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.381321 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" event={"ID":"87763cd9-7b99-4b9a-8e0f-02ea849a6b56","Type":"ContainerStarted","Data":"4cc769be6d390810ecd41ff54dfb748892cb5b5e4147d5bbde48823235e1945a"} Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.381371 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" 
event={"ID":"87763cd9-7b99-4b9a-8e0f-02ea849a6b56","Type":"ContainerStarted","Data":"69de1d0db2a5cd41e7321dcadb6d495c25ab9bfc48f453ac6d03e314608d82f9"} Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.383823 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30ded24a-ee08-4d96-80f1-3d5793ec76bb","Type":"ContainerStarted","Data":"72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b"} Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.384052 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.390871 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" event={"ID":"bbc79520-07bf-4876-a889-9bcd4b3be4c7","Type":"ContainerDied","Data":"43cfe5e93cbd71c6353cc4b52b6fe796d586dc5bbfdfaf14d995966f55726a47"} Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.390928 4782 scope.go:117] "RemoveContainer" containerID="7be8a0f509b2b1fd773c6b227fa70a90f67a96dd0e8af262aa73cb429c59845d" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.390999 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8666d45c85-h72w2" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.392737 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbdqv" event={"ID":"8a9f5c0e-8d43-437d-b47e-e72f03df077b","Type":"ContainerStarted","Data":"2e7f31a7fb305d28537d309ff34371904703cb4b1bfacf448cf2850a8686c79c"} Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.395624 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"9e3b2844-afde-444d-b7ee-cddd8b543bf6","Type":"ContainerStarted","Data":"70fe199a03cf6b2b971c85be030bb48fa205ab3ff9c59a1475f18dc632be5e3b"} Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.396047 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.415555 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=59.129218213 podStartE2EDuration="59.415528417s" podCreationTimestamp="2026-01-30 18:46:35 +0000 UTC" firstStartedPulling="2026-01-30 18:46:56.59513246 +0000 UTC m=+992.863510525" lastFinishedPulling="2026-01-30 18:46:56.881442704 +0000 UTC m=+993.149820729" observedRunningTime="2026-01-30 18:47:34.400953125 +0000 UTC m=+1030.669331150" watchObservedRunningTime="2026-01-30 18:47:34.415528417 +0000 UTC m=+1030.683906432" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.429239 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.443563894 podStartE2EDuration="59.429208135s" podCreationTimestamp="2026-01-30 18:46:35 +0000 UTC" firstStartedPulling="2026-01-30 18:46:47.8686463 +0000 UTC m=+984.137024325" lastFinishedPulling="2026-01-30 18:46:56.854290541 +0000 UTC m=+993.122668566" observedRunningTime="2026-01-30 18:47:34.426925189 +0000 UTC m=+1030.695303214" watchObservedRunningTime="2026-01-30 18:47:34.429208135 +0000 UTC m=+1030.697586160" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.467143 4782 scope.go:117] "RemoveContainer" containerID="fb815805e7de86e70f9a0aa4f3224a66c2e86ecca3d4f8f3027550a456e4dd38" Jan 30 18:47:34 
crc kubenswrapper[4782]: I0130 18:47:34.497619 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=58.211921892 podStartE2EDuration="58.49759313s" podCreationTimestamp="2026-01-30 18:46:36 +0000 UTC" firstStartedPulling="2026-01-30 18:46:56.590410413 +0000 UTC m=+992.858788478" lastFinishedPulling="2026-01-30 18:46:56.876081691 +0000 UTC m=+993.144459716" observedRunningTime="2026-01-30 18:47:34.454940393 +0000 UTC m=+1030.723318418" watchObservedRunningTime="2026-01-30 18:47:34.49759313 +0000 UTC m=+1030.765971155" Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.529150 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8666d45c85-h72w2"] Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.542492 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8666d45c85-h72w2"] Jan 30 18:47:34 crc kubenswrapper[4782]: I0130 18:47:34.763860 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 30 18:47:35 crc kubenswrapper[4782]: I0130 18:47:35.414924 4782 generic.go:334] "Generic (PLEG): container finished" podID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerID="4cc769be6d390810ecd41ff54dfb748892cb5b5e4147d5bbde48823235e1945a" exitCode=0 Jan 30 18:47:35 crc kubenswrapper[4782]: I0130 18:47:35.414982 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" event={"ID":"87763cd9-7b99-4b9a-8e0f-02ea849a6b56","Type":"ContainerDied","Data":"4cc769be6d390810ecd41ff54dfb748892cb5b5e4147d5bbde48823235e1945a"} Jan 30 18:47:35 crc kubenswrapper[4782]: I0130 18:47:35.415414 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" event={"ID":"87763cd9-7b99-4b9a-8e0f-02ea849a6b56","Type":"ContainerStarted","Data":"fb83cbd6b0e8b328ce1008786a222eced5d79485fdf4a18d7bc18dcf71bc4267"} Jan 30 18:47:35 crc kubenswrapper[4782]: I0130 18:47:35.416267 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:36 crc kubenswrapper[4782]: I0130 18:47:36.423603 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" path="/var/lib/kubelet/pods/bbc79520-07bf-4876-a889-9bcd4b3be4c7/volumes" Jan 30 18:47:36 crc kubenswrapper[4782]: I0130 18:47:36.434497 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerStarted","Data":"6340af41e50b610f37ae4f20cbcae1c5bf4ef053a6c7cc71c76a6438f00b6848"} Jan 30 18:47:38 crc kubenswrapper[4782]: I0130 18:47:38.482371 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbdqv" event={"ID":"8a9f5c0e-8d43-437d-b47e-e72f03df077b","Type":"ContainerStarted","Data":"4c1c2a2146c08e0bb522055e996c3439a134a5a131e54a1608d164106871e63f"} Jan 30 18:47:38 crc kubenswrapper[4782]: I0130 18:47:38.502759 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mbdqv" podStartSLOduration=7.20152863 podStartE2EDuration="10.502737774s" podCreationTimestamp="2026-01-30 18:47:28 +0000 UTC" firstStartedPulling="2026-01-30 18:47:34.050001591 +0000 UTC m=+1030.318379616" lastFinishedPulling="2026-01-30 18:47:37.351210725 +0000 UTC m=+1033.619588760" observedRunningTime="2026-01-30 18:47:38.496763106 
+0000 UTC m=+1034.765141151" watchObservedRunningTime="2026-01-30 18:47:38.502737774 +0000 UTC m=+1034.771115809" Jan 30 18:47:38 crc kubenswrapper[4782]: I0130 18:47:38.503057 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" podStartSLOduration=15.503050782 podStartE2EDuration="15.503050782s" podCreationTimestamp="2026-01-30 18:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:35.437043973 +0000 UTC m=+1031.705422008" watchObservedRunningTime="2026-01-30 18:47:38.503050782 +0000 UTC m=+1034.771428817" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.210940 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-l6fxk"] Jan 30 18:47:39 crc kubenswrapper[4782]: E0130 18:47:39.211284 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="dnsmasq-dns" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.211301 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="dnsmasq-dns" Jan 30 18:47:39 crc kubenswrapper[4782]: E0130 18:47:39.211336 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="init" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.211343 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="init" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.211497 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc79520-07bf-4876-a889-9bcd4b3be4c7" containerName="dnsmasq-dns" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.212003 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.214951 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.226125 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-l6fxk"] Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.260339 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4194658-d823-4c23-86fc-2ea3221bcd19-operator-scripts\") pod \"root-account-create-update-l6fxk\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.260725 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl2nq\" (UniqueName: \"kubernetes.io/projected/b4194658-d823-4c23-86fc-2ea3221bcd19-kube-api-access-sl2nq\") pod \"root-account-create-update-l6fxk\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:39 crc kubenswrapper[4782]: E0130 18:47:39.271006 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.271914 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.362084 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4194658-d823-4c23-86fc-2ea3221bcd19-operator-scripts\") pod \"root-account-create-update-l6fxk\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.362386 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl2nq\" (UniqueName: \"kubernetes.io/projected/b4194658-d823-4c23-86fc-2ea3221bcd19-kube-api-access-sl2nq\") pod \"root-account-create-update-l6fxk\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.364941 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4194658-d823-4c23-86fc-2ea3221bcd19-operator-scripts\") pod \"root-account-create-update-l6fxk\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.382563 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl2nq\" (UniqueName: \"kubernetes.io/projected/b4194658-d823-4c23-86fc-2ea3221bcd19-kube-api-access-sl2nq\") pod \"root-account-create-update-l6fxk\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.493699 4782 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerStarted","Data":"e060c5ad3fd1db961e0960e03d13a7b2f1c40090cda4dc28fea7a058dfb1eaa2"} Jan 30 18:47:39 crc kubenswrapper[4782]: E0130 18:47:39.495376 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" Jan 30 18:47:39 crc kubenswrapper[4782]: I0130 18:47:39.546866 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.001292 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-l6fxk"] Jan 30 18:47:40 crc kubenswrapper[4782]: W0130 18:47:40.005412 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4194658_d823_4c23_86fc_2ea3221bcd19.slice/crio-18a2ca523005df6f229671915ed378016326dd8cb37d4f7de7c8b8510da1e89f WatchSource:0}: Error finding container 18a2ca523005df6f229671915ed378016326dd8cb37d4f7de7c8b8510da1e89f: Status 404 returned error can't find the container with id 18a2ca523005df6f229671915ed378016326dd8cb37d4f7de7c8b8510da1e89f Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.289343 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-xj8zj"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.290379 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.299834 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xj8zj"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.368150 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d17b-account-create-update-lswx2"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.369688 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.372006 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.387759 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6866\" (UniqueName: \"kubernetes.io/projected/5676c98c-0259-43b5-a04f-12e9f8f74746-kube-api-access-c6866\") pod \"keystone-d17b-account-create-update-lswx2\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.387822 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-operator-scripts\") pod \"keystone-db-create-xj8zj\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.387866 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2d5\" (UniqueName: \"kubernetes.io/projected/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-kube-api-access-cp2d5\") pod \"keystone-db-create-xj8zj\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.387883 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5676c98c-0259-43b5-a04f-12e9f8f74746-operator-scripts\") pod \"keystone-d17b-account-create-update-lswx2\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.387927 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d17b-account-create-update-lswx2"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.488517 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5676c98c-0259-43b5-a04f-12e9f8f74746-operator-scripts\") pod \"keystone-d17b-account-create-update-lswx2\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.488683 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6866\" (UniqueName: \"kubernetes.io/projected/5676c98c-0259-43b5-a04f-12e9f8f74746-kube-api-access-c6866\") pod \"keystone-d17b-account-create-update-lswx2\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.488713 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.488742 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-operator-scripts\") pod \"keystone-db-create-xj8zj\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.488775 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2d5\" (UniqueName: \"kubernetes.io/projected/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-kube-api-access-cp2d5\") pod \"keystone-db-create-xj8zj\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: E0130 18:47:40.489192 4782 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 18:47:40 crc kubenswrapper[4782]: E0130 18:47:40.489216 4782 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 18:47:40 crc kubenswrapper[4782]: E0130 18:47:40.489295 4782 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift podName:15ef1358-db2b-4935-b53c-7aad2613cee7 nodeName:}" failed. No retries permitted until 2026-01-30 18:47:56.489277298 +0000 UTC m=+1052.757655323 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift") pod "swift-storage-0" (UID: "15ef1358-db2b-4935-b53c-7aad2613cee7") : configmap "swift-ring-files" not found Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.489800 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-operator-scripts\") pod \"keystone-db-create-xj8zj\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.490126 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5676c98c-0259-43b5-a04f-12e9f8f74746-operator-scripts\") pod \"keystone-d17b-account-create-update-lswx2\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.511817 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l6fxk" event={"ID":"b4194658-d823-4c23-86fc-2ea3221bcd19","Type":"ContainerStarted","Data":"32774d969d034474cf88dae3b5c042e29584f630eda4fec4dc8a5aee101211af"} Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.511918 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l6fxk" event={"ID":"b4194658-d823-4c23-86fc-2ea3221bcd19","Type":"ContainerStarted","Data":"18a2ca523005df6f229671915ed378016326dd8cb37d4f7de7c8b8510da1e89f"} Jan 30 18:47:40 crc kubenswrapper[4782]: E0130 18:47:40.514175 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741\\\"\"" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 
18:47:40.515135 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6866\" (UniqueName: \"kubernetes.io/projected/5676c98c-0259-43b5-a04f-12e9f8f74746-kube-api-access-c6866\") pod \"keystone-d17b-account-create-update-lswx2\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.518489 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2d5\" (UniqueName: \"kubernetes.io/projected/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-kube-api-access-cp2d5\") pod \"keystone-db-create-xj8zj\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.542454 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-l6fxk" podStartSLOduration=1.542428825 podStartE2EDuration="1.542428825s" podCreationTimestamp="2026-01-30 18:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:40.530258393 +0000 UTC m=+1036.798636438" watchObservedRunningTime="2026-01-30 18:47:40.542428825 +0000 UTC m=+1036.810806850" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.611842 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.640037 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m5tvv"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.641503 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.652514 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m5tvv"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.686037 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.768011 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2b50-account-create-update-sbscq"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.769413 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.780755 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.797856 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tnpc\" (UniqueName: \"kubernetes.io/projected/b18bb25b-4060-4f0d-8856-3e90a46209d9-kube-api-access-2tnpc\") pod \"placement-db-create-m5tvv\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.797905 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b18bb25b-4060-4f0d-8856-3e90a46209d9-operator-scripts\") pod \"placement-db-create-m5tvv\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.799610 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2b50-account-create-update-sbscq"] Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.900097 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tnpc\" (UniqueName: \"kubernetes.io/projected/b18bb25b-4060-4f0d-8856-3e90a46209d9-kube-api-access-2tnpc\") pod \"placement-db-create-m5tvv\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.900179 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b18bb25b-4060-4f0d-8856-3e90a46209d9-operator-scripts\") pod \"placement-db-create-m5tvv\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.900410 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d82414-65bf-4612-8255-097d4e82b25b-operator-scripts\") pod \"placement-2b50-account-create-update-sbscq\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.900448 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26f82\" (UniqueName: \"kubernetes.io/projected/80d82414-65bf-4612-8255-097d4e82b25b-kube-api-access-26f82\") pod \"placement-2b50-account-create-update-sbscq\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.901121 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b18bb25b-4060-4f0d-8856-3e90a46209d9-operator-scripts\") pod \"placement-db-create-m5tvv\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:40 crc kubenswrapper[4782]: I0130 18:47:40.920631 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tnpc\" (UniqueName: 
\"kubernetes.io/projected/b18bb25b-4060-4f0d-8856-3e90a46209d9-kube-api-access-2tnpc\") pod \"placement-db-create-m5tvv\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.002424 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d82414-65bf-4612-8255-097d4e82b25b-operator-scripts\") pod \"placement-2b50-account-create-update-sbscq\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.002853 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26f82\" (UniqueName: \"kubernetes.io/projected/80d82414-65bf-4612-8255-097d4e82b25b-kube-api-access-26f82\") pod \"placement-2b50-account-create-update-sbscq\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.006190 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d82414-65bf-4612-8255-097d4e82b25b-operator-scripts\") pod \"placement-2b50-account-create-update-sbscq\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.020665 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26f82\" (UniqueName: \"kubernetes.io/projected/80d82414-65bf-4612-8255-097d4e82b25b-kube-api-access-26f82\") pod \"placement-2b50-account-create-update-sbscq\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.032786 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.097587 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.117772 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xj8zj"] Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.293481 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d17b-account-create-update-lswx2"] Jan 30 18:47:41 crc kubenswrapper[4782]: W0130 18:47:41.316774 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5676c98c_0259_43b5_a04f_12e9f8f74746.slice/crio-c5dfaf028c8bcd566aa1d61c501293a490437baeda0b608d96b8ecbab9f4b2f8 WatchSource:0}: Error finding container c5dfaf028c8bcd566aa1d61c501293a490437baeda0b608d96b8ecbab9f4b2f8: Status 404 returned error can't find the container with id c5dfaf028c8bcd566aa1d61c501293a490437baeda0b608d96b8ecbab9f4b2f8 Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.418159 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.430499 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bk7c8" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.433301 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2b50-account-create-update-sbscq"] Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.439497 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pz2pk" podUID="91d457a1-1878-47f1-a1d3-eac450864978" containerName="ovn-controller" probeResult="failure" output=< Jan 30 18:47:41 crc kubenswrapper[4782]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 18:47:41 crc kubenswrapper[4782]: > Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.526164 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d17b-account-create-update-lswx2" event={"ID":"5676c98c-0259-43b5-a04f-12e9f8f74746","Type":"ContainerStarted","Data":"c5dfaf028c8bcd566aa1d61c501293a490437baeda0b608d96b8ecbab9f4b2f8"} Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.531931 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2b50-account-create-update-sbscq" event={"ID":"80d82414-65bf-4612-8255-097d4e82b25b","Type":"ContainerStarted","Data":"f5f4c1d198a7a2257fad10a38bd99240c59984dfc023173851b869afc2bc41e6"} Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.533570 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m5tvv"] Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.536568 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l6fxk" event={"ID":"b4194658-d823-4c23-86fc-2ea3221bcd19","Type":"ContainerDied","Data":"32774d969d034474cf88dae3b5c042e29584f630eda4fec4dc8a5aee101211af"} Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.537011 4782 generic.go:334] "Generic (PLEG): container finished" podID="b4194658-d823-4c23-86fc-2ea3221bcd19" containerID="32774d969d034474cf88dae3b5c042e29584f630eda4fec4dc8a5aee101211af" exitCode=0 Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.538539 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xj8zj" 
event={"ID":"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1","Type":"ContainerStarted","Data":"b183169d7e6af3e3159dcba10baa92df856271979556ec5a652b91f7470e9c0e"} Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.538586 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xj8zj" event={"ID":"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1","Type":"ContainerStarted","Data":"aea3df0697757c7dd634b01af76e7fd4e2df5020d6e1c3aa4415806c4fcea10e"} Jan 30 18:47:41 crc kubenswrapper[4782]: W0130 18:47:41.547486 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18bb25b_4060_4f0d_8856_3e90a46209d9.slice/crio-d27a5b6ea5e476d34e7cf1bcc62a8eb94b3f724fc7e52db6cb022e9502afd096 WatchSource:0}: Error finding container d27a5b6ea5e476d34e7cf1bcc62a8eb94b3f724fc7e52db6cb022e9502afd096: Status 404 returned error can't find the container with id d27a5b6ea5e476d34e7cf1bcc62a8eb94b3f724fc7e52db6cb022e9502afd096 Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.576515 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-xj8zj" podStartSLOduration=1.576497013 podStartE2EDuration="1.576497013s" podCreationTimestamp="2026-01-30 18:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:41.570213718 +0000 UTC m=+1037.838591733" watchObservedRunningTime="2026-01-30 18:47:41.576497013 +0000 UTC m=+1037.844875028" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.704130 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pz2pk-config-kzt78"] Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.705134 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.715039 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.720061 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.720135 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-log-ovn\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.720191 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvbx\" (UniqueName: \"kubernetes.io/projected/97b3603d-9188-4747-8404-2797c916d9e0-kube-api-access-5qvbx\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.720292 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-additional-scripts\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.720397 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-scripts\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.720450 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run-ovn\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.726752 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pz2pk-config-kzt78"] Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.822727 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvbx\" (UniqueName: \"kubernetes.io/projected/97b3603d-9188-4747-8404-2797c916d9e0-kube-api-access-5qvbx\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823030 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-additional-scripts\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823283 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-scripts\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823394 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run-ovn\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823496 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823593 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-log-ovn\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823846 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run-ovn\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823849 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-log-ovn\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823852 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-additional-scripts\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.823885 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.826050 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-scripts\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:41 crc kubenswrapper[4782]: I0130 18:47:41.843538 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvbx\" (UniqueName: \"kubernetes.io/projected/97b3603d-9188-4747-8404-2797c916d9e0-kube-api-access-5qvbx\") pod \"ovn-controller-pz2pk-config-kzt78\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.041016 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.500482 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pz2pk-config-kzt78"] Jan 30 18:47:42 crc kubenswrapper[4782]: W0130 18:47:42.503732 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97b3603d_9188_4747_8404_2797c916d9e0.slice/crio-a306b1e47c2cc8d474f1d439300c28de23c23490978ca6c1fd63ccdeeb181371 WatchSource:0}: Error finding container a306b1e47c2cc8d474f1d439300c28de23c23490978ca6c1fd63ccdeeb181371: Status 404 returned error can't find the container with id a306b1e47c2cc8d474f1d439300c28de23c23490978ca6c1fd63ccdeeb181371 Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.552750 4782 generic.go:334] "Generic (PLEG): container finished" podID="b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1" containerID="b183169d7e6af3e3159dcba10baa92df856271979556ec5a652b91f7470e9c0e" exitCode=0 Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.553804 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xj8zj" event={"ID":"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1","Type":"ContainerDied","Data":"b183169d7e6af3e3159dcba10baa92df856271979556ec5a652b91f7470e9c0e"} Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.557145 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk-config-kzt78" event={"ID":"97b3603d-9188-4747-8404-2797c916d9e0","Type":"ContainerStarted","Data":"a306b1e47c2cc8d474f1d439300c28de23c23490978ca6c1fd63ccdeeb181371"} Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.559106 4782 generic.go:334] "Generic (PLEG): container finished" podID="b18bb25b-4060-4f0d-8856-3e90a46209d9" containerID="c76799fd89750a18b8e322f9aa27474cace0504e98179f692441c6108201ddfb" exitCode=0 Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.559361 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5tvv" event={"ID":"b18bb25b-4060-4f0d-8856-3e90a46209d9","Type":"ContainerDied","Data":"c76799fd89750a18b8e322f9aa27474cace0504e98179f692441c6108201ddfb"} Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.559486 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5tvv" event={"ID":"b18bb25b-4060-4f0d-8856-3e90a46209d9","Type":"ContainerStarted","Data":"d27a5b6ea5e476d34e7cf1bcc62a8eb94b3f724fc7e52db6cb022e9502afd096"} Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.561545 4782 generic.go:334] "Generic (PLEG): container finished" podID="5676c98c-0259-43b5-a04f-12e9f8f74746" containerID="765683b4de05082bbce5d7d676196e5d33546263498f7e3b30d6630bc3b582df" exitCode=0 Jan 30 18:47:42 
crc kubenswrapper[4782]: I0130 18:47:42.561702 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d17b-account-create-update-lswx2" event={"ID":"5676c98c-0259-43b5-a04f-12e9f8f74746","Type":"ContainerDied","Data":"765683b4de05082bbce5d7d676196e5d33546263498f7e3b30d6630bc3b582df"} Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.581532 4782 generic.go:334] "Generic (PLEG): container finished" podID="80d82414-65bf-4612-8255-097d4e82b25b" containerID="11f9764b35b7c74af148cec5204cd14d3224b9f03d6b3a2b0344235f061774d3" exitCode=0 Jan 30 18:47:42 crc kubenswrapper[4782]: I0130 18:47:42.581628 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2b50-account-create-update-sbscq" event={"ID":"80d82414-65bf-4612-8255-097d4e82b25b","Type":"ContainerDied","Data":"11f9764b35b7c74af148cec5204cd14d3224b9f03d6b3a2b0344235f061774d3"} Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.122527 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.162806 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4194658-d823-4c23-86fc-2ea3221bcd19-operator-scripts\") pod \"b4194658-d823-4c23-86fc-2ea3221bcd19\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.162889 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl2nq\" (UniqueName: \"kubernetes.io/projected/b4194658-d823-4c23-86fc-2ea3221bcd19-kube-api-access-sl2nq\") pod \"b4194658-d823-4c23-86fc-2ea3221bcd19\" (UID: \"b4194658-d823-4c23-86fc-2ea3221bcd19\") " Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.164682 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4194658-d823-4c23-86fc-2ea3221bcd19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4194658-d823-4c23-86fc-2ea3221bcd19" (UID: "b4194658-d823-4c23-86fc-2ea3221bcd19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.169434 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4194658-d823-4c23-86fc-2ea3221bcd19-kube-api-access-sl2nq" (OuterVolumeSpecName: "kube-api-access-sl2nq") pod "b4194658-d823-4c23-86fc-2ea3221bcd19" (UID: "b4194658-d823-4c23-86fc-2ea3221bcd19"). InnerVolumeSpecName "kube-api-access-sl2nq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.265291 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4194658-d823-4c23-86fc-2ea3221bcd19-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.265325 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl2nq\" (UniqueName: \"kubernetes.io/projected/b4194658-d823-4c23-86fc-2ea3221bcd19-kube-api-access-sl2nq\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.380205 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-svqmg"] Jan 30 18:47:43 crc kubenswrapper[4782]: E0130 18:47:43.380565 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4194658-d823-4c23-86fc-2ea3221bcd19" containerName="mariadb-account-create-update" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.380580 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4194658-d823-4c23-86fc-2ea3221bcd19" containerName="mariadb-account-create-update" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.380757 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4194658-d823-4c23-86fc-2ea3221bcd19" containerName="mariadb-account-create-update" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.381323 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.396111 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-75b9-account-create-update-sdvxj"] Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.397658 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.400871 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.402603 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-svqmg"] Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.409270 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-75b9-account-create-update-sdvxj"] Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.468662 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c449eb35-703e-4c85-b4ec-52918bedb59d-operator-scripts\") pod \"watcher-db-create-svqmg\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.468954 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp689\" (UniqueName: \"kubernetes.io/projected/c449eb35-703e-4c85-b4ec-52918bedb59d-kube-api-access-kp689\") pod \"watcher-db-create-svqmg\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.469101 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlngv\" (UniqueName: \"kubernetes.io/projected/452e963a-2af0-417d-a9f2-4ce3490829e3-kube-api-access-jlngv\") pod \"watcher-75b9-account-create-update-sdvxj\" (UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.469258 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/452e963a-2af0-417d-a9f2-4ce3490829e3-operator-scripts\") pod \"watcher-75b9-account-create-update-sdvxj\" (UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.565429 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.570356 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c449eb35-703e-4c85-b4ec-52918bedb59d-operator-scripts\") pod \"watcher-db-create-svqmg\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.570401 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp689\" (UniqueName: \"kubernetes.io/projected/c449eb35-703e-4c85-b4ec-52918bedb59d-kube-api-access-kp689\") pod \"watcher-db-create-svqmg\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.570427 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlngv\" (UniqueName: \"kubernetes.io/projected/452e963a-2af0-417d-a9f2-4ce3490829e3-kube-api-access-jlngv\") pod \"watcher-75b9-account-create-update-sdvxj\" 
(UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.570483 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/452e963a-2af0-417d-a9f2-4ce3490829e3-operator-scripts\") pod \"watcher-75b9-account-create-update-sdvxj\" (UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.571366 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c449eb35-703e-4c85-b4ec-52918bedb59d-operator-scripts\") pod \"watcher-db-create-svqmg\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.571534 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/452e963a-2af0-417d-a9f2-4ce3490829e3-operator-scripts\") pod \"watcher-75b9-account-create-update-sdvxj\" (UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.591934 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp689\" (UniqueName: \"kubernetes.io/projected/c449eb35-703e-4c85-b4ec-52918bedb59d-kube-api-access-kp689\") pod \"watcher-db-create-svqmg\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.592115 4782 generic.go:334] "Generic (PLEG): container finished" podID="97b3603d-9188-4747-8404-2797c916d9e0" containerID="5bee7ff4bdab2b1e98536ca8e9369f2b7f05e4fcf60259e2742a3240d8fd2d4f" exitCode=0 Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.592182 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk-config-kzt78" event={"ID":"97b3603d-9188-4747-8404-2797c916d9e0","Type":"ContainerDied","Data":"5bee7ff4bdab2b1e98536ca8e9369f2b7f05e4fcf60259e2742a3240d8fd2d4f"} Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.593926 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-l6fxk" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.594219 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l6fxk" event={"ID":"b4194658-d823-4c23-86fc-2ea3221bcd19","Type":"ContainerDied","Data":"18a2ca523005df6f229671915ed378016326dd8cb37d4f7de7c8b8510da1e89f"} Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.594421 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18a2ca523005df6f229671915ed378016326dd8cb37d4f7de7c8b8510da1e89f" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.598053 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlngv\" (UniqueName: \"kubernetes.io/projected/452e963a-2af0-417d-a9f2-4ce3490829e3-kube-api-access-jlngv\") pod \"watcher-75b9-account-create-update-sdvxj\" (UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.675240 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f5b65c6f-ms4px"] Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.675522 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" podUID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerName="dnsmasq-dns" containerID="cri-o://aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d" gracePeriod=10 Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.708573 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:43 crc kubenswrapper[4782]: I0130 18:47:43.719646 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.069408 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.179762 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d82414-65bf-4612-8255-097d4e82b25b-operator-scripts\") pod \"80d82414-65bf-4612-8255-097d4e82b25b\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.179815 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26f82\" (UniqueName: \"kubernetes.io/projected/80d82414-65bf-4612-8255-097d4e82b25b-kube-api-access-26f82\") pod \"80d82414-65bf-4612-8255-097d4e82b25b\" (UID: \"80d82414-65bf-4612-8255-097d4e82b25b\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.181865 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80d82414-65bf-4612-8255-097d4e82b25b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80d82414-65bf-4612-8255-097d4e82b25b" (UID: "80d82414-65bf-4612-8255-097d4e82b25b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.185375 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d82414-65bf-4612-8255-097d4e82b25b-kube-api-access-26f82" (OuterVolumeSpecName: "kube-api-access-26f82") pod "80d82414-65bf-4612-8255-097d4e82b25b" (UID: "80d82414-65bf-4612-8255-097d4e82b25b"). InnerVolumeSpecName "kube-api-access-26f82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.282160 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80d82414-65bf-4612-8255-097d4e82b25b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.282441 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26f82\" (UniqueName: \"kubernetes.io/projected/80d82414-65bf-4612-8255-097d4e82b25b-kube-api-access-26f82\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.418516 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.425904 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.429029 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.486672 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-operator-scripts\") pod \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.486720 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6866\" (UniqueName: \"kubernetes.io/projected/5676c98c-0259-43b5-a04f-12e9f8f74746-kube-api-access-c6866\") pod \"5676c98c-0259-43b5-a04f-12e9f8f74746\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.486758 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp2d5\" (UniqueName: \"kubernetes.io/projected/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-kube-api-access-cp2d5\") pod \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\" (UID: \"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.486773 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tnpc\" (UniqueName: \"kubernetes.io/projected/b18bb25b-4060-4f0d-8856-3e90a46209d9-kube-api-access-2tnpc\") pod \"b18bb25b-4060-4f0d-8856-3e90a46209d9\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.486814 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5676c98c-0259-43b5-a04f-12e9f8f74746-operator-scripts\") pod \"5676c98c-0259-43b5-a04f-12e9f8f74746\" (UID: \"5676c98c-0259-43b5-a04f-12e9f8f74746\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.486834 4782 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b18bb25b-4060-4f0d-8856-3e90a46209d9-operator-scripts\") pod \"b18bb25b-4060-4f0d-8856-3e90a46209d9\" (UID: \"b18bb25b-4060-4f0d-8856-3e90a46209d9\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.487467 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b18bb25b-4060-4f0d-8856-3e90a46209d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b18bb25b-4060-4f0d-8856-3e90a46209d9" (UID: "b18bb25b-4060-4f0d-8856-3e90a46209d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.487804 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1" (UID: "b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.489322 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5676c98c-0259-43b5-a04f-12e9f8f74746-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5676c98c-0259-43b5-a04f-12e9f8f74746" (UID: "5676c98c-0259-43b5-a04f-12e9f8f74746"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.496834 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5676c98c-0259-43b5-a04f-12e9f8f74746-kube-api-access-c6866" (OuterVolumeSpecName: "kube-api-access-c6866") pod "5676c98c-0259-43b5-a04f-12e9f8f74746" (UID: "5676c98c-0259-43b5-a04f-12e9f8f74746"). InnerVolumeSpecName "kube-api-access-c6866". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.497063 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-kube-api-access-cp2d5" (OuterVolumeSpecName: "kube-api-access-cp2d5") pod "b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1" (UID: "b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1"). InnerVolumeSpecName "kube-api-access-cp2d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.497118 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18bb25b-4060-4f0d-8856-3e90a46209d9-kube-api-access-2tnpc" (OuterVolumeSpecName: "kube-api-access-2tnpc") pod "b18bb25b-4060-4f0d-8856-3e90a46209d9" (UID: "b18bb25b-4060-4f0d-8856-3e90a46209d9"). InnerVolumeSpecName "kube-api-access-2tnpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.498012 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.587809 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-sb\") pod \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.588747 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-config\") pod \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.588824 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwmnw\" (UniqueName: \"kubernetes.io/projected/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-kube-api-access-pwmnw\") pod \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.588943 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-nb\") pod \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.589091 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-dns-svc\") pod \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\" (UID: \"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f\") " Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.589405 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5676c98c-0259-43b5-a04f-12e9f8f74746-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.589478 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b18bb25b-4060-4f0d-8856-3e90a46209d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.589533 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.589600 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6866\" (UniqueName: \"kubernetes.io/projected/5676c98c-0259-43b5-a04f-12e9f8f74746-kube-api-access-c6866\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.589664 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tnpc\" (UniqueName: \"kubernetes.io/projected/b18bb25b-4060-4f0d-8856-3e90a46209d9-kube-api-access-2tnpc\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.589726 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp2d5\" (UniqueName: \"kubernetes.io/projected/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1-kube-api-access-cp2d5\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.596044 4782 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-kube-api-access-pwmnw" (OuterVolumeSpecName: "kube-api-access-pwmnw") pod "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" (UID: "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f"). InnerVolumeSpecName "kube-api-access-pwmnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.630743 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xj8zj" event={"ID":"b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1","Type":"ContainerDied","Data":"aea3df0697757c7dd634b01af76e7fd4e2df5020d6e1c3aa4415806c4fcea10e"} Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.630787 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aea3df0697757c7dd634b01af76e7fd4e2df5020d6e1c3aa4415806c4fcea10e" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.630897 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xj8zj" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.636355 4782 generic.go:334] "Generic (PLEG): container finished" podID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerID="aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d" exitCode=0 Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.636430 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" event={"ID":"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f","Type":"ContainerDied","Data":"aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d"} Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.636461 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" event={"ID":"7a28a7e1-572a-488d-a38f-3fe4dacf6b4f","Type":"ContainerDied","Data":"6db36d287ce44ef8e468d69b9d2df5aa08f04740847ff02590055e100bae3fb0"} Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.636477 4782 scope.go:117] "RemoveContainer" containerID="aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.636646 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56f5b65c6f-ms4px" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.639529 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5tvv" event={"ID":"b18bb25b-4060-4f0d-8856-3e90a46209d9","Type":"ContainerDied","Data":"d27a5b6ea5e476d34e7cf1bcc62a8eb94b3f724fc7e52db6cb022e9502afd096"} Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.639571 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d27a5b6ea5e476d34e7cf1bcc62a8eb94b3f724fc7e52db6cb022e9502afd096" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.639650 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-m5tvv" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.658788 4782 scope.go:117] "RemoveContainer" containerID="826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.660973 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d17b-account-create-update-lswx2" event={"ID":"5676c98c-0259-43b5-a04f-12e9f8f74746","Type":"ContainerDied","Data":"c5dfaf028c8bcd566aa1d61c501293a490437baeda0b608d96b8ecbab9f4b2f8"} Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.661010 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5dfaf028c8bcd566aa1d61c501293a490437baeda0b608d96b8ecbab9f4b2f8" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.661072 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d17b-account-create-update-lswx2" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.661179 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" (UID: "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.664303 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2b50-account-create-update-sbscq" event={"ID":"80d82414-65bf-4612-8255-097d4e82b25b","Type":"ContainerDied","Data":"f5f4c1d198a7a2257fad10a38bd99240c59984dfc023173851b869afc2bc41e6"} Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.664350 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5f4c1d198a7a2257fad10a38bd99240c59984dfc023173851b869afc2bc41e6" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.663814 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2b50-account-create-update-sbscq" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.669393 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-config" (OuterVolumeSpecName: "config") pod "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" (UID: "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.684134 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" (UID: "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.693336 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-75b9-account-create-update-sdvxj"] Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.694269 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.694977 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.695077 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.695144 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwmnw\" (UniqueName: \"kubernetes.io/projected/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-kube-api-access-pwmnw\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.703721 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-svqmg"] Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.713037 4782 scope.go:117] "RemoveContainer" containerID="aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d" Jan 30 18:47:44 crc kubenswrapper[4782]: E0130 18:47:44.713763 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d\": container with ID starting with aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d not found: ID does not exist" containerID="aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.713791 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d"} err="failed to get container status \"aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d\": rpc error: code = NotFound desc = could not find container \"aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d\": container with ID starting with aecac590201279e8f2d416cd74fa84794ead5ae317484052ef5377cb8c84047d not found: ID does not exist" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.713811 4782 scope.go:117] "RemoveContainer" containerID="826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06" Jan 30 18:47:44 crc kubenswrapper[4782]: E0130 18:47:44.714477 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06\": container with ID starting with 826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06 not found: ID does not exist" containerID="826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.714600 4782 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06"} err="failed to get container status \"826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06\": rpc error: code = NotFound desc = could not find container \"826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06\": container with ID starting with 826530b4c01611262696cbd28f6a7f5948ce7a4fce222bb1b39cf73ff42c8e06 not found: ID does not exist" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.734756 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" (UID: "7a28a7e1-572a-488d-a38f-3fe4dacf6b4f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.798248 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.977681 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56f5b65c6f-ms4px"] Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.981644 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:44 crc kubenswrapper[4782]: I0130 18:47:44.983715 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56f5b65c6f-ms4px"] Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.001530 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-additional-scripts\") pod \"97b3603d-9188-4747-8404-2797c916d9e0\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.001605 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-scripts\") pod \"97b3603d-9188-4747-8404-2797c916d9e0\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.001653 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run\") pod \"97b3603d-9188-4747-8404-2797c916d9e0\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.001672 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run-ovn\") pod \"97b3603d-9188-4747-8404-2797c916d9e0\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.001700 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-log-ovn\") pod \"97b3603d-9188-4747-8404-2797c916d9e0\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.001719 4782 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5qvbx\" (UniqueName: \"kubernetes.io/projected/97b3603d-9188-4747-8404-2797c916d9e0-kube-api-access-5qvbx\") pod \"97b3603d-9188-4747-8404-2797c916d9e0\" (UID: \"97b3603d-9188-4747-8404-2797c916d9e0\") " Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.002982 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run" (OuterVolumeSpecName: "var-run") pod "97b3603d-9188-4747-8404-2797c916d9e0" (UID: "97b3603d-9188-4747-8404-2797c916d9e0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.003454 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "97b3603d-9188-4747-8404-2797c916d9e0" (UID: "97b3603d-9188-4747-8404-2797c916d9e0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.004052 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-scripts" (OuterVolumeSpecName: "scripts") pod "97b3603d-9188-4747-8404-2797c916d9e0" (UID: "97b3603d-9188-4747-8404-2797c916d9e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.004082 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "97b3603d-9188-4747-8404-2797c916d9e0" (UID: "97b3603d-9188-4747-8404-2797c916d9e0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.004099 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "97b3603d-9188-4747-8404-2797c916d9e0" (UID: "97b3603d-9188-4747-8404-2797c916d9e0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.005539 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b3603d-9188-4747-8404-2797c916d9e0-kube-api-access-5qvbx" (OuterVolumeSpecName: "kube-api-access-5qvbx") pod "97b3603d-9188-4747-8404-2797c916d9e0" (UID: "97b3603d-9188-4747-8404-2797c916d9e0"). InnerVolumeSpecName "kube-api-access-5qvbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.103397 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.103435 4782 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.103448 4782 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.103463 4782 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/97b3603d-9188-4747-8404-2797c916d9e0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.103474 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qvbx\" (UniqueName: \"kubernetes.io/projected/97b3603d-9188-4747-8404-2797c916d9e0-kube-api-access-5qvbx\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.103486 4782 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/97b3603d-9188-4747-8404-2797c916d9e0-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.672919 4782 generic.go:334] "Generic (PLEG): container finished" podID="452e963a-2af0-417d-a9f2-4ce3490829e3" containerID="5da084ba64db7ade035b8ed1dd0a82f59c530670ca4f5e1c2fd28bbf8cec3adf" exitCode=0 Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.673012 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-75b9-account-create-update-sdvxj" event={"ID":"452e963a-2af0-417d-a9f2-4ce3490829e3","Type":"ContainerDied","Data":"5da084ba64db7ade035b8ed1dd0a82f59c530670ca4f5e1c2fd28bbf8cec3adf"} Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.673055 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-75b9-account-create-update-sdvxj" event={"ID":"452e963a-2af0-417d-a9f2-4ce3490829e3","Type":"ContainerStarted","Data":"351671d68ab00e4248fe37a4514785af8589ffeebd200dcf68a36aa82bee5f45"} Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.674654 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk-config-kzt78" event={"ID":"97b3603d-9188-4747-8404-2797c916d9e0","Type":"ContainerDied","Data":"a306b1e47c2cc8d474f1d439300c28de23c23490978ca6c1fd63ccdeeb181371"} Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.674702 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a306b1e47c2cc8d474f1d439300c28de23c23490978ca6c1fd63ccdeeb181371" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.674674 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-kzt78" Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.677080 4782 generic.go:334] "Generic (PLEG): container finished" podID="c449eb35-703e-4c85-b4ec-52918bedb59d" containerID="e8123a25d952fe4b1c57a157863df1d87c46b680beabbe16a538608abe0b8bfc" exitCode=0 Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.677127 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-svqmg" event={"ID":"c449eb35-703e-4c85-b4ec-52918bedb59d","Type":"ContainerDied","Data":"e8123a25d952fe4b1c57a157863df1d87c46b680beabbe16a538608abe0b8bfc"} Jan 30 18:47:45 crc kubenswrapper[4782]: I0130 18:47:45.677364 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-svqmg" event={"ID":"c449eb35-703e-4c85-b4ec-52918bedb59d","Type":"ContainerStarted","Data":"986cb7b4088117c3fc1df877b8a21c65e5986592696892061a0655db0d20ce27"} Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.126685 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pz2pk-config-kzt78"] Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.132983 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-pz2pk-config-kzt78"] Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.283896 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pz2pk-config-4s5gq"] Jan 30 18:47:46 crc kubenswrapper[4782]: E0130 18:47:46.284342 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerName="init" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284365 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerName="init" Jan 30 18:47:46 crc kubenswrapper[4782]: E0130 18:47:46.284386 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b3603d-9188-4747-8404-2797c916d9e0" containerName="ovn-config" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284393 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b3603d-9188-4747-8404-2797c916d9e0" containerName="ovn-config" Jan 30 18:47:46 crc kubenswrapper[4782]: E0130 18:47:46.284412 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5676c98c-0259-43b5-a04f-12e9f8f74746" containerName="mariadb-account-create-update" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284419 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="5676c98c-0259-43b5-a04f-12e9f8f74746" containerName="mariadb-account-create-update" Jan 30 18:47:46 crc kubenswrapper[4782]: E0130 18:47:46.284433 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1" containerName="mariadb-database-create" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284440 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1" containerName="mariadb-database-create" Jan 30 18:47:46 crc kubenswrapper[4782]: E0130 18:47:46.284454 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerName="dnsmasq-dns" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284462 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerName="dnsmasq-dns" Jan 30 18:47:46 crc kubenswrapper[4782]: E0130 18:47:46.284478 4782 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="80d82414-65bf-4612-8255-097d4e82b25b" containerName="mariadb-account-create-update" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284486 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d82414-65bf-4612-8255-097d4e82b25b" containerName="mariadb-account-create-update" Jan 30 18:47:46 crc kubenswrapper[4782]: E0130 18:47:46.284495 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18bb25b-4060-4f0d-8856-3e90a46209d9" containerName="mariadb-database-create" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284503 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18bb25b-4060-4f0d-8856-3e90a46209d9" containerName="mariadb-database-create" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284683 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18bb25b-4060-4f0d-8856-3e90a46209d9" containerName="mariadb-database-create" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284707 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1" containerName="mariadb-database-create" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284721 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" containerName="dnsmasq-dns" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284732 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d82414-65bf-4612-8255-097d4e82b25b" containerName="mariadb-account-create-update" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284751 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="5676c98c-0259-43b5-a04f-12e9f8f74746" containerName="mariadb-account-create-update" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.284763 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b3603d-9188-4747-8404-2797c916d9e0" containerName="ovn-config" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.285506 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.287205 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.305500 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pz2pk-config-4s5gq"] Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.361316 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-pz2pk" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.419473 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a28a7e1-572a-488d-a38f-3fe4dacf6b4f" path="/var/lib/kubelet/pods/7a28a7e1-572a-488d-a38f-3fe4dacf6b4f/volumes" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.420164 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b3603d-9188-4747-8404-2797c916d9e0" path="/var/lib/kubelet/pods/97b3603d-9188-4747-8404-2797c916d9e0/volumes" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.423491 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-scripts\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.423527 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run-ovn\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.423567 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-log-ovn\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.423595 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq98p\" (UniqueName: \"kubernetes.io/projected/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-kube-api-access-gq98p\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.423643 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-additional-scripts\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.423678 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " 
pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525352 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525483 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-scripts\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525530 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run-ovn\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525590 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-log-ovn\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525637 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq98p\" (UniqueName: \"kubernetes.io/projected/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-kube-api-access-gq98p\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525669 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525689 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run-ovn\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525713 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-additional-scripts\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.525757 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-log-ovn\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " 
pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.527049 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-additional-scripts\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.527530 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-scripts\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.544382 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq98p\" (UniqueName: \"kubernetes.io/projected/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-kube-api-access-gq98p\") pod \"ovn-controller-pz2pk-config-4s5gq\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.607297 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.686915 4782 generic.go:334] "Generic (PLEG): container finished" podID="8a9f5c0e-8d43-437d-b47e-e72f03df077b" containerID="4c1c2a2146c08e0bb522055e996c3439a134a5a131e54a1608d164106871e63f" exitCode=0 Jan 30 18:47:46 crc kubenswrapper[4782]: I0130 18:47:46.687083 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbdqv" event={"ID":"8a9f5c0e-8d43-437d-b47e-e72f03df077b","Type":"ContainerDied","Data":"4c1c2a2146c08e0bb522055e996c3439a134a5a131e54a1608d164106871e63f"} Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.039922 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.077884 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pz2pk-config-4s5gq"] Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.139913 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.145521 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.247729 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp689\" (UniqueName: \"kubernetes.io/projected/c449eb35-703e-4c85-b4ec-52918bedb59d-kube-api-access-kp689\") pod \"c449eb35-703e-4c85-b4ec-52918bedb59d\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.247819 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlngv\" (UniqueName: \"kubernetes.io/projected/452e963a-2af0-417d-a9f2-4ce3490829e3-kube-api-access-jlngv\") pod \"452e963a-2af0-417d-a9f2-4ce3490829e3\" (UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.247890 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/452e963a-2af0-417d-a9f2-4ce3490829e3-operator-scripts\") pod \"452e963a-2af0-417d-a9f2-4ce3490829e3\" (UID: \"452e963a-2af0-417d-a9f2-4ce3490829e3\") " Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.247940 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c449eb35-703e-4c85-b4ec-52918bedb59d-operator-scripts\") pod \"c449eb35-703e-4c85-b4ec-52918bedb59d\" (UID: \"c449eb35-703e-4c85-b4ec-52918bedb59d\") " Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.248607 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/452e963a-2af0-417d-a9f2-4ce3490829e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "452e963a-2af0-417d-a9f2-4ce3490829e3" (UID: "452e963a-2af0-417d-a9f2-4ce3490829e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.248641 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c449eb35-703e-4c85-b4ec-52918bedb59d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c449eb35-703e-4c85-b4ec-52918bedb59d" (UID: "c449eb35-703e-4c85-b4ec-52918bedb59d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.254377 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452e963a-2af0-417d-a9f2-4ce3490829e3-kube-api-access-jlngv" (OuterVolumeSpecName: "kube-api-access-jlngv") pod "452e963a-2af0-417d-a9f2-4ce3490829e3" (UID: "452e963a-2af0-417d-a9f2-4ce3490829e3"). InnerVolumeSpecName "kube-api-access-jlngv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.254944 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c449eb35-703e-4c85-b4ec-52918bedb59d-kube-api-access-kp689" (OuterVolumeSpecName: "kube-api-access-kp689") pod "c449eb35-703e-4c85-b4ec-52918bedb59d" (UID: "c449eb35-703e-4c85-b4ec-52918bedb59d"). InnerVolumeSpecName "kube-api-access-kp689". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.319826 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.350168 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/452e963a-2af0-417d-a9f2-4ce3490829e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.350206 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c449eb35-703e-4c85-b4ec-52918bedb59d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.350221 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp689\" (UniqueName: \"kubernetes.io/projected/c449eb35-703e-4c85-b4ec-52918bedb59d-kube-api-access-kp689\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.350251 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlngv\" (UniqueName: \"kubernetes.io/projected/452e963a-2af0-417d-a9f2-4ce3490829e3-kube-api-access-jlngv\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.637205 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="9e3b2844-afde-444d-b7ee-cddd8b543bf6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.657694 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-l6fxk"] Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.668434 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-l6fxk"] Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.696122 4782 generic.go:334] "Generic (PLEG): container finished" podID="c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" containerID="e711fa721ad0a82a8f463a6ab7d1c38710fc1b9a7e3d2c723b82eee21147acbb" exitCode=0 Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.696193 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk-config-4s5gq" event={"ID":"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9","Type":"ContainerDied","Data":"e711fa721ad0a82a8f463a6ab7d1c38710fc1b9a7e3d2c723b82eee21147acbb"} Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.696252 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk-config-4s5gq" event={"ID":"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9","Type":"ContainerStarted","Data":"bcbaa4c7121574c6e1780884edd83082667c45148b91ac90a94d843fe91ba082"} Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.698156 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-svqmg" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.698155 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-svqmg" event={"ID":"c449eb35-703e-4c85-b4ec-52918bedb59d","Type":"ContainerDied","Data":"986cb7b4088117c3fc1df877b8a21c65e5986592696892061a0655db0d20ce27"} Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.698203 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="986cb7b4088117c3fc1df877b8a21c65e5986592696892061a0655db0d20ce27" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.700078 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-75b9-account-create-update-sdvxj" event={"ID":"452e963a-2af0-417d-a9f2-4ce3490829e3","Type":"ContainerDied","Data":"351671d68ab00e4248fe37a4514785af8589ffeebd200dcf68a36aa82bee5f45"} Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.700117 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="351671d68ab00e4248fe37a4514785af8589ffeebd200dcf68a36aa82bee5f45" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.700128 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-75b9-account-create-update-sdvxj" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.749798 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-49bsv"] Jan 30 18:47:47 crc kubenswrapper[4782]: E0130 18:47:47.750134 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452e963a-2af0-417d-a9f2-4ce3490829e3" containerName="mariadb-account-create-update" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.750150 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="452e963a-2af0-417d-a9f2-4ce3490829e3" containerName="mariadb-account-create-update" Jan 30 18:47:47 crc kubenswrapper[4782]: E0130 18:47:47.750162 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c449eb35-703e-4c85-b4ec-52918bedb59d" containerName="mariadb-database-create" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.750169 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c449eb35-703e-4c85-b4ec-52918bedb59d" containerName="mariadb-database-create" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.750350 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c449eb35-703e-4c85-b4ec-52918bedb59d" containerName="mariadb-database-create" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.750364 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="452e963a-2af0-417d-a9f2-4ce3490829e3" containerName="mariadb-account-create-update" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.750946 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.753597 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.786288 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-49bsv"] Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.863301 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0227d7-251c-4e1b-ac70-2c923e735208-operator-scripts\") pod \"root-account-create-update-49bsv\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.863653 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9slmd\" (UniqueName: \"kubernetes.io/projected/0d0227d7-251c-4e1b-ac70-2c923e735208-kube-api-access-9slmd\") pod \"root-account-create-update-49bsv\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.965586 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0227d7-251c-4e1b-ac70-2c923e735208-operator-scripts\") pod \"root-account-create-update-49bsv\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.965643 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9slmd\" (UniqueName: \"kubernetes.io/projected/0d0227d7-251c-4e1b-ac70-2c923e735208-kube-api-access-9slmd\") pod \"root-account-create-update-49bsv\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.966770 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0227d7-251c-4e1b-ac70-2c923e735208-operator-scripts\") pod \"root-account-create-update-49bsv\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:47 crc kubenswrapper[4782]: I0130 18:47:47.987399 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9slmd\" (UniqueName: \"kubernetes.io/projected/0d0227d7-251c-4e1b-ac70-2c923e735208-kube-api-access-9slmd\") pod \"root-account-create-update-49bsv\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.055060 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.082952 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.167835 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-swiftconf\") pod \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.167951 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-dispersionconf\") pod \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.167987 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-combined-ca-bundle\") pod \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.168028 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flmfl\" (UniqueName: \"kubernetes.io/projected/8a9f5c0e-8d43-437d-b47e-e72f03df077b-kube-api-access-flmfl\") pod \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.168064 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9f5c0e-8d43-437d-b47e-e72f03df077b-etc-swift\") pod \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.168085 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-scripts\") pod \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.168166 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-ring-data-devices\") pod \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\" (UID: \"8a9f5c0e-8d43-437d-b47e-e72f03df077b\") " Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.169058 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8a9f5c0e-8d43-437d-b47e-e72f03df077b" (UID: "8a9f5c0e-8d43-437d-b47e-e72f03df077b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.169492 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9f5c0e-8d43-437d-b47e-e72f03df077b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8a9f5c0e-8d43-437d-b47e-e72f03df077b" (UID: "8a9f5c0e-8d43-437d-b47e-e72f03df077b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.178476 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9f5c0e-8d43-437d-b47e-e72f03df077b-kube-api-access-flmfl" (OuterVolumeSpecName: "kube-api-access-flmfl") pod "8a9f5c0e-8d43-437d-b47e-e72f03df077b" (UID: "8a9f5c0e-8d43-437d-b47e-e72f03df077b"). InnerVolumeSpecName "kube-api-access-flmfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.187400 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8a9f5c0e-8d43-437d-b47e-e72f03df077b" (UID: "8a9f5c0e-8d43-437d-b47e-e72f03df077b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.200127 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a9f5c0e-8d43-437d-b47e-e72f03df077b" (UID: "8a9f5c0e-8d43-437d-b47e-e72f03df077b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.202803 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-scripts" (OuterVolumeSpecName: "scripts") pod "8a9f5c0e-8d43-437d-b47e-e72f03df077b" (UID: "8a9f5c0e-8d43-437d-b47e-e72f03df077b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.213385 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8a9f5c0e-8d43-437d-b47e-e72f03df077b" (UID: "8a9f5c0e-8d43-437d-b47e-e72f03df077b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.269881 4782 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.269912 4782 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.269923 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9f5c0e-8d43-437d-b47e-e72f03df077b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.269933 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flmfl\" (UniqueName: \"kubernetes.io/projected/8a9f5c0e-8d43-437d-b47e-e72f03df077b-kube-api-access-flmfl\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.269942 4782 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9f5c0e-8d43-437d-b47e-e72f03df077b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.269950 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.269958 4782 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9f5c0e-8d43-437d-b47e-e72f03df077b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.420245 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4194658-d823-4c23-86fc-2ea3221bcd19" path="/var/lib/kubelet/pods/b4194658-d823-4c23-86fc-2ea3221bcd19/volumes" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.588458 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-49bsv"] Jan 30 18:47:48 crc kubenswrapper[4782]: W0130 18:47:48.602222 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d0227d7_251c_4e1b_ac70_2c923e735208.slice/crio-035e4ca4edae696767b0ab55e5851a72f5a3dc437c6b07da1802975c6f3bee53 WatchSource:0}: Error finding container 035e4ca4edae696767b0ab55e5851a72f5a3dc437c6b07da1802975c6f3bee53: Status 404 returned error can't find the container with id 035e4ca4edae696767b0ab55e5851a72f5a3dc437c6b07da1802975c6f3bee53 Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.707413 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-49bsv" event={"ID":"0d0227d7-251c-4e1b-ac70-2c923e735208","Type":"ContainerStarted","Data":"035e4ca4edae696767b0ab55e5851a72f5a3dc437c6b07da1802975c6f3bee53"} Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.708965 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mbdqv" Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.709795 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mbdqv" event={"ID":"8a9f5c0e-8d43-437d-b47e-e72f03df077b","Type":"ContainerDied","Data":"2e7f31a7fb305d28537d309ff34371904703cb4b1bfacf448cf2850a8686c79c"} Jan 30 18:47:48 crc kubenswrapper[4782]: I0130 18:47:48.709823 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e7f31a7fb305d28537d309ff34371904703cb4b1bfacf448cf2850a8686c79c" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.019141 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.086732 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq98p\" (UniqueName: \"kubernetes.io/projected/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-kube-api-access-gq98p\") pod \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.086778 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run-ovn\") pod \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.086806 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-additional-scripts\") pod \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.086910 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run\") pod \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.086945 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-log-ovn\") pod \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.086990 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-scripts\") pod \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\" (UID: \"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9\") " Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.088048 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" (UID: "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.088100 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" (UID: "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.088120 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run" (OuterVolumeSpecName: "var-run") pod "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" (UID: "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.088137 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" (UID: "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.088184 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-scripts" (OuterVolumeSpecName: "scripts") pod "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" (UID: "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.095390 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-kube-api-access-gq98p" (OuterVolumeSpecName: "kube-api-access-gq98p") pod "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" (UID: "c952ea6b-bc9b-42d2-8374-eea7ba8c51f9"). InnerVolumeSpecName "kube-api-access-gq98p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.188530 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.188560 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq98p\" (UniqueName: \"kubernetes.io/projected/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-kube-api-access-gq98p\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.188569 4782 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.188579 4782 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.188589 4782 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.188597 4782 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.722825 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pz2pk-config-4s5gq" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.723594 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pz2pk-config-4s5gq" event={"ID":"c952ea6b-bc9b-42d2-8374-eea7ba8c51f9","Type":"ContainerDied","Data":"bcbaa4c7121574c6e1780884edd83082667c45148b91ac90a94d843fe91ba082"} Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.723646 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcbaa4c7121574c6e1780884edd83082667c45148b91ac90a94d843fe91ba082" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.728746 4782 generic.go:334] "Generic (PLEG): container finished" podID="0d0227d7-251c-4e1b-ac70-2c923e735208" containerID="e9c029ed1031ab8f9741c88e25f7433f0516f565f0ca555b47f2321112cca9d0" exitCode=0 Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.728798 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-49bsv" event={"ID":"0d0227d7-251c-4e1b-ac70-2c923e735208","Type":"ContainerDied","Data":"e9c029ed1031ab8f9741c88e25f7433f0516f565f0ca555b47f2321112cca9d0"} Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.792755 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.793141 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.793216 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.794177 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fab30dd15f3ee1b70d16c1b0ecbd40cb333806a11c1e80db2f89491cb55d627e"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:47:49 crc kubenswrapper[4782]: I0130 18:47:49.794262 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://fab30dd15f3ee1b70d16c1b0ecbd40cb333806a11c1e80db2f89491cb55d627e" gracePeriod=600 Jan 30 18:47:50 crc kubenswrapper[4782]: I0130 18:47:50.109298 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pz2pk-config-4s5gq"] Jan 30 18:47:50 crc kubenswrapper[4782]: I0130 18:47:50.119496 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-pz2pk-config-4s5gq"] Jan 30 18:47:50 crc kubenswrapper[4782]: I0130 18:47:50.421605 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" path="/var/lib/kubelet/pods/c952ea6b-bc9b-42d2-8374-eea7ba8c51f9/volumes" Jan 30 18:47:50 crc kubenswrapper[4782]: I0130 18:47:50.739809 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="fab30dd15f3ee1b70d16c1b0ecbd40cb333806a11c1e80db2f89491cb55d627e" exitCode=0 Jan 30 18:47:50 crc kubenswrapper[4782]: I0130 18:47:50.739895 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"fab30dd15f3ee1b70d16c1b0ecbd40cb333806a11c1e80db2f89491cb55d627e"} Jan 30 18:47:50 crc kubenswrapper[4782]: I0130 18:47:50.739953 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"30752c64226ba6f7e596e12313e1d813b202b07c8fa5c5bad850072993bf2126"} Jan 30 18:47:50 crc kubenswrapper[4782]: I0130 18:47:50.739974 4782 scope.go:117] "RemoveContainer" containerID="78bfa63564c6e38c41b1d267ccc1a5244efd72734f22ff3b1aac2f4b890ae15f" Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.096548 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.119196 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0227d7-251c-4e1b-ac70-2c923e735208-operator-scripts\") pod \"0d0227d7-251c-4e1b-ac70-2c923e735208\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.119343 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9slmd\" (UniqueName: \"kubernetes.io/projected/0d0227d7-251c-4e1b-ac70-2c923e735208-kube-api-access-9slmd\") pod \"0d0227d7-251c-4e1b-ac70-2c923e735208\" (UID: \"0d0227d7-251c-4e1b-ac70-2c923e735208\") " Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.120252 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0227d7-251c-4e1b-ac70-2c923e735208-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d0227d7-251c-4e1b-ac70-2c923e735208" (UID: "0d0227d7-251c-4e1b-ac70-2c923e735208"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.127327 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d0227d7-251c-4e1b-ac70-2c923e735208-kube-api-access-9slmd" (OuterVolumeSpecName: "kube-api-access-9slmd") pod "0d0227d7-251c-4e1b-ac70-2c923e735208" (UID: "0d0227d7-251c-4e1b-ac70-2c923e735208"). InnerVolumeSpecName "kube-api-access-9slmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.221417 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9slmd\" (UniqueName: \"kubernetes.io/projected/0d0227d7-251c-4e1b-ac70-2c923e735208-kube-api-access-9slmd\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.221454 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0227d7-251c-4e1b-ac70-2c923e735208-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.750082 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-49bsv" event={"ID":"0d0227d7-251c-4e1b-ac70-2c923e735208","Type":"ContainerDied","Data":"035e4ca4edae696767b0ab55e5851a72f5a3dc437c6b07da1802975c6f3bee53"} Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.750335 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035e4ca4edae696767b0ab55e5851a72f5a3dc437c6b07da1802975c6f3bee53" Jan 30 18:47:51 crc kubenswrapper[4782]: I0130 18:47:51.750153 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-49bsv" Jan 30 18:47:53 crc kubenswrapper[4782]: I0130 18:47:53.764882 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerStarted","Data":"d53c1f8b8401bd826cb739193815637f91833604fce2905f3e4b49162f0da4ef"} Jan 30 18:47:53 crc kubenswrapper[4782]: I0130 18:47:53.791010 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.082471847 podStartE2EDuration="1m10.790992616s" podCreationTimestamp="2026-01-30 18:46:43 +0000 UTC" firstStartedPulling="2026-01-30 18:46:57.503892974 +0000 UTC m=+993.772270999" lastFinishedPulling="2026-01-30 18:47:53.212413743 +0000 UTC m=+1049.480791768" observedRunningTime="2026-01-30 18:47:53.788145666 +0000 UTC m=+1050.056523701" watchObservedRunningTime="2026-01-30 18:47:53.790992616 +0000 UTC m=+1050.059370651" Jan 30 18:47:54 crc kubenswrapper[4782]: I0130 18:47:54.529537 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 18:47:56 crc kubenswrapper[4782]: I0130 18:47:56.545497 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:56 crc kubenswrapper[4782]: I0130 18:47:56.556321 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15ef1358-db2b-4935-b53c-7aad2613cee7-etc-swift\") pod \"swift-storage-0\" (UID: \"15ef1358-db2b-4935-b53c-7aad2613cee7\") " pod="openstack/swift-storage-0" Jan 30 18:47:56 crc kubenswrapper[4782]: I0130 18:47:56.844364 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.037699 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.320410 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.408649 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-48gwt"] Jan 30 18:47:57 crc kubenswrapper[4782]: E0130 18:47:57.408973 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d0227d7-251c-4e1b-ac70-2c923e735208" containerName="mariadb-account-create-update" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.408985 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0227d7-251c-4e1b-ac70-2c923e735208" containerName="mariadb-account-create-update" Jan 30 18:47:57 crc kubenswrapper[4782]: E0130 18:47:57.409008 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" containerName="ovn-config" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.409015 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" containerName="ovn-config" Jan 30 18:47:57 crc kubenswrapper[4782]: E0130 18:47:57.409023 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9f5c0e-8d43-437d-b47e-e72f03df077b" containerName="swift-ring-rebalance" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.409029 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9f5c0e-8d43-437d-b47e-e72f03df077b" containerName="swift-ring-rebalance" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.409167 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c952ea6b-bc9b-42d2-8374-eea7ba8c51f9" containerName="ovn-config" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.409177 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9f5c0e-8d43-437d-b47e-e72f03df077b" containerName="swift-ring-rebalance" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.409189 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d0227d7-251c-4e1b-ac70-2c923e735208" containerName="mariadb-account-create-update" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.409723 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.431695 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-48gwt"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.478368 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vddgn"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.479539 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.500782 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-cff8-account-create-update-dnxld"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.501912 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.504995 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.510508 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vddgn"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.520704 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-cff8-account-create-update-dnxld"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.568327 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-operator-scripts\") pod \"cinder-db-create-48gwt\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.568383 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcdb105-a248-440d-89b4-8c4ad0420ecc-operator-scripts\") pod \"barbican-db-create-vddgn\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.568453 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vth5p\" (UniqueName: \"kubernetes.io/projected/8bcdb105-a248-440d-89b4-8c4ad0420ecc-kube-api-access-vth5p\") pod \"barbican-db-create-vddgn\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.568477 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98szt\" (UniqueName: \"kubernetes.io/projected/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-kube-api-access-98szt\") pod \"cinder-db-create-48gwt\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.573296 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.598356 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-430c-account-create-update-snr88"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.599870 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.601279 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.613697 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-430c-account-create-update-snr88"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.637429 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.671355 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcdb105-a248-440d-89b4-8c4ad0420ecc-operator-scripts\") pod \"barbican-db-create-vddgn\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.671440 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxvbj\" (UniqueName: \"kubernetes.io/projected/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-kube-api-access-vxvbj\") pod \"cinder-cff8-account-create-update-dnxld\" (UID: \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.671488 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-operator-scripts\") pod \"cinder-cff8-account-create-update-dnxld\" (UID: \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.671533 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vth5p\" (UniqueName: \"kubernetes.io/projected/8bcdb105-a248-440d-89b4-8c4ad0420ecc-kube-api-access-vth5p\") pod \"barbican-db-create-vddgn\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.671685 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98szt\" (UniqueName: \"kubernetes.io/projected/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-kube-api-access-98szt\") pod \"cinder-db-create-48gwt\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.672037 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcdb105-a248-440d-89b4-8c4ad0420ecc-operator-scripts\") pod \"barbican-db-create-vddgn\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.672135 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-operator-scripts\") pod \"cinder-db-create-48gwt\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.672632 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-operator-scripts\") pod \"cinder-db-create-48gwt\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.702763 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vth5p\" (UniqueName: \"kubernetes.io/projected/8bcdb105-a248-440d-89b4-8c4ad0420ecc-kube-api-access-vth5p\") pod \"barbican-db-create-vddgn\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.702763 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98szt\" (UniqueName: \"kubernetes.io/projected/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-kube-api-access-98szt\") pod \"cinder-db-create-48gwt\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.732219 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-48gwt" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.761094 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-c4cx2"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.764220 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.769259 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.769452 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.769557 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.769709 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7d8px" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.774145 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c4cx2"] Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.775059 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjvv\" (UniqueName: \"kubernetes.io/projected/d7c7a692-b10f-4799-bae7-8e2d88d26d22-kube-api-access-lmjvv\") pod \"barbican-430c-account-create-update-snr88\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.775194 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxvbj\" (UniqueName: \"kubernetes.io/projected/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-kube-api-access-vxvbj\") pod \"cinder-cff8-account-create-update-dnxld\" (UID: \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.775219 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-operator-scripts\") pod \"cinder-cff8-account-create-update-dnxld\" (UID: 
\"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.775252 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7a692-b10f-4799-bae7-8e2d88d26d22-operator-scripts\") pod \"barbican-430c-account-create-update-snr88\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.777062 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-operator-scripts\") pod \"cinder-cff8-account-create-update-dnxld\" (UID: \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.798027 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vddgn" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.805397 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxvbj\" (UniqueName: \"kubernetes.io/projected/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-kube-api-access-vxvbj\") pod \"cinder-cff8-account-create-update-dnxld\" (UID: \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.813930 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.817856 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"5bf4a092746e60f74a8fef7793eeb1e008581011dc723dfb23301334cd41004e"} Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.878499 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjvv\" (UniqueName: \"kubernetes.io/projected/d7c7a692-b10f-4799-bae7-8e2d88d26d22-kube-api-access-lmjvv\") pod \"barbican-430c-account-create-update-snr88\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.878582 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-config-data\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.878704 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blrl4\" (UniqueName: \"kubernetes.io/projected/16e94436-5c08-4fa9-8e93-8929251269ff-kube-api-access-blrl4\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.878814 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-combined-ca-bundle\") pod 
\"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.878860 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7a692-b10f-4799-bae7-8e2d88d26d22-operator-scripts\") pod \"barbican-430c-account-create-update-snr88\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.879650 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7a692-b10f-4799-bae7-8e2d88d26d22-operator-scripts\") pod \"barbican-430c-account-create-update-snr88\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.902330 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmjvv\" (UniqueName: \"kubernetes.io/projected/d7c7a692-b10f-4799-bae7-8e2d88d26d22-kube-api-access-lmjvv\") pod \"barbican-430c-account-create-update-snr88\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.926545 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.981416 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-config-data\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.981519 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blrl4\" (UniqueName: \"kubernetes.io/projected/16e94436-5c08-4fa9-8e93-8929251269ff-kube-api-access-blrl4\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.981642 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-combined-ca-bundle\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.990959 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-config-data\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:57 crc kubenswrapper[4782]: I0130 18:47:57.996626 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-combined-ca-bundle\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.007464 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-blrl4\" (UniqueName: \"kubernetes.io/projected/16e94436-5c08-4fa9-8e93-8929251269ff-kube-api-access-blrl4\") pod \"keystone-db-sync-c4cx2\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.170744 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-cff8-account-create-update-dnxld"] Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.183511 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.279847 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-48gwt"] Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.301882 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vddgn"] Jan 30 18:47:58 crc kubenswrapper[4782]: W0130 18:47:58.319172 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe1d4b19_02b2_4c3b_9e1a_eb9505b41503.slice/crio-114794ffcf14f771822d4debb1cf829a9b9b9a6e256060d088d271cd8c2f9efa WatchSource:0}: Error finding container 114794ffcf14f771822d4debb1cf829a9b9b9a6e256060d088d271cd8c2f9efa: Status 404 returned error can't find the container with id 114794ffcf14f771822d4debb1cf829a9b9b9a6e256060d088d271cd8c2f9efa Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.329086 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-430c-account-create-update-snr88"] Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.749218 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c4cx2"] Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.828747 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vddgn" event={"ID":"8bcdb105-a248-440d-89b4-8c4ad0420ecc","Type":"ContainerStarted","Data":"6d375d9916b814b6d8de486a8387ae232b7c4a3f238ca32c98eaf2f71b4265c5"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.829046 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vddgn" event={"ID":"8bcdb105-a248-440d-89b4-8c4ad0420ecc","Type":"ContainerStarted","Data":"739e43083308a403e75f9b868076cbc8d6c539c23a465bbe298b9e8f9cb5f6eb"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.831200 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cff8-account-create-update-dnxld" event={"ID":"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee","Type":"ContainerStarted","Data":"28014afbc1904184eb012bb140f92300a8fc6040f9c157d149230c64231a20bb"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.831250 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cff8-account-create-update-dnxld" event={"ID":"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee","Type":"ContainerStarted","Data":"414e281ac73c56025b5de627f16f8d96eb509a13eb7f162526e998db86550696"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.834443 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-48gwt" event={"ID":"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503","Type":"ContainerStarted","Data":"ec684cc32e76bdf862e44c6145b5c52e5f6176da735b769e01dfbdd2e1617872"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.834468 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-48gwt" 
event={"ID":"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503","Type":"ContainerStarted","Data":"114794ffcf14f771822d4debb1cf829a9b9b9a6e256060d088d271cd8c2f9efa"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.836173 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-430c-account-create-update-snr88" event={"ID":"d7c7a692-b10f-4799-bae7-8e2d88d26d22","Type":"ContainerStarted","Data":"27b16233d6c6435ae002778664c2f5e610da53503977ac9cbffc76a512829a6a"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.836203 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-430c-account-create-update-snr88" event={"ID":"d7c7a692-b10f-4799-bae7-8e2d88d26d22","Type":"ContainerStarted","Data":"4b0425718ed3ab0164b51ced5546038f97c64859eae4f6254f5d7aa1f86aa8f7"} Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.846199 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-vddgn" podStartSLOduration=1.846179233 podStartE2EDuration="1.846179233s" podCreationTimestamp="2026-01-30 18:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:58.844656665 +0000 UTC m=+1055.113034690" watchObservedRunningTime="2026-01-30 18:47:58.846179233 +0000 UTC m=+1055.114557258" Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.861976 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-48gwt" podStartSLOduration=1.861955284 podStartE2EDuration="1.861955284s" podCreationTimestamp="2026-01-30 18:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:58.856082378 +0000 UTC m=+1055.124460403" watchObservedRunningTime="2026-01-30 18:47:58.861955284 +0000 UTC m=+1055.130333309" Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.881573 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-430c-account-create-update-snr88" podStartSLOduration=1.8815512989999998 podStartE2EDuration="1.881551299s" podCreationTimestamp="2026-01-30 18:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:58.875304194 +0000 UTC m=+1055.143682219" watchObservedRunningTime="2026-01-30 18:47:58.881551299 +0000 UTC m=+1055.149929324" Jan 30 18:47:58 crc kubenswrapper[4782]: I0130 18:47:58.897151 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-cff8-account-create-update-dnxld" podStartSLOduration=1.897133805 podStartE2EDuration="1.897133805s" podCreationTimestamp="2026-01-30 18:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:47:58.891519016 +0000 UTC m=+1055.159897041" watchObservedRunningTime="2026-01-30 18:47:58.897133805 +0000 UTC m=+1055.165511830" Jan 30 18:47:59 crc kubenswrapper[4782]: W0130 18:47:59.207043 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16e94436_5c08_4fa9_8e93_8929251269ff.slice/crio-c91f712f873dfc4e351eff0bb2ea52df3a47a9df198e920ea7c3bf5c06a6e948 WatchSource:0}: Error finding container c91f712f873dfc4e351eff0bb2ea52df3a47a9df198e920ea7c3bf5c06a6e948: Status 404 returned error 
can't find the container with id c91f712f873dfc4e351eff0bb2ea52df3a47a9df198e920ea7c3bf5c06a6e948 Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.529907 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.531767 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.845849 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4cx2" event={"ID":"16e94436-5c08-4fa9-8e93-8929251269ff","Type":"ContainerStarted","Data":"c91f712f873dfc4e351eff0bb2ea52df3a47a9df198e920ea7c3bf5c06a6e948"} Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.874447 4782 generic.go:334] "Generic (PLEG): container finished" podID="dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee" containerID="28014afbc1904184eb012bb140f92300a8fc6040f9c157d149230c64231a20bb" exitCode=0 Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.874535 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cff8-account-create-update-dnxld" event={"ID":"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee","Type":"ContainerDied","Data":"28014afbc1904184eb012bb140f92300a8fc6040f9c157d149230c64231a20bb"} Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.893596 4782 generic.go:334] "Generic (PLEG): container finished" podID="fe1d4b19-02b2-4c3b-9e1a-eb9505b41503" containerID="ec684cc32e76bdf862e44c6145b5c52e5f6176da735b769e01dfbdd2e1617872" exitCode=0 Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.893689 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-48gwt" event={"ID":"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503","Type":"ContainerDied","Data":"ec684cc32e76bdf862e44c6145b5c52e5f6176da735b769e01dfbdd2e1617872"} Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.895617 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"1610d2692df7fea3d0990521a9742963c7fc9c032f05d3f8bd900bdd6c5374a3"} Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.895642 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"a99ca03e668b8775cb707b50ff0729c78394fc50faa69feb0e78c2037751a8a9"} Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.897008 4782 generic.go:334] "Generic (PLEG): container finished" podID="d7c7a692-b10f-4799-bae7-8e2d88d26d22" containerID="27b16233d6c6435ae002778664c2f5e610da53503977ac9cbffc76a512829a6a" exitCode=0 Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.897049 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-430c-account-create-update-snr88" event={"ID":"d7c7a692-b10f-4799-bae7-8e2d88d26d22","Type":"ContainerDied","Data":"27b16233d6c6435ae002778664c2f5e610da53503977ac9cbffc76a512829a6a"} Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.899801 4782 generic.go:334] "Generic (PLEG): container finished" podID="8bcdb105-a248-440d-89b4-8c4ad0420ecc" containerID="6d375d9916b814b6d8de486a8387ae232b7c4a3f238ca32c98eaf2f71b4265c5" exitCode=0 Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.900946 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vddgn" 
event={"ID":"8bcdb105-a248-440d-89b4-8c4ad0420ecc","Type":"ContainerDied","Data":"6d375d9916b814b6d8de486a8387ae232b7c4a3f238ca32c98eaf2f71b4265c5"} Jan 30 18:47:59 crc kubenswrapper[4782]: I0130 18:47:59.907607 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.116339 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lqdnd"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.117526 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.132648 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lqdnd"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.171850 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-lnnb7"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.172986 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.176515 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.176922 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-pvpzs" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.179270 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-lnnb7"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.240345 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzqgj\" (UniqueName: \"kubernetes.io/projected/a8c361db-acc1-4cf3-aff9-f66661a2e327-kube-api-access-rzqgj\") pod \"glance-db-create-lqdnd\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.240703 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c361db-acc1-4cf3-aff9-f66661a2e327-operator-scripts\") pod \"glance-db-create-lqdnd\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.270677 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-cac0-account-create-update-khnrr"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.273000 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.275761 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.284246 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-cac0-account-create-update-khnrr"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.341643 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-combined-ca-bundle\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.341686 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-config-data\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.341704 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8kws\" (UniqueName: \"kubernetes.io/projected/5aa667b9-2c1d-4219-9f23-666323d7f509-kube-api-access-g8kws\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.341736 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzqgj\" (UniqueName: \"kubernetes.io/projected/a8c361db-acc1-4cf3-aff9-f66661a2e327-kube-api-access-rzqgj\") pod \"glance-db-create-lqdnd\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.341765 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-db-sync-config-data\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.341836 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c361db-acc1-4cf3-aff9-f66661a2e327-operator-scripts\") pod \"glance-db-create-lqdnd\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.342607 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c361db-acc1-4cf3-aff9-f66661a2e327-operator-scripts\") pod \"glance-db-create-lqdnd\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.360150 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzqgj\" (UniqueName: \"kubernetes.io/projected/a8c361db-acc1-4cf3-aff9-f66661a2e327-kube-api-access-rzqgj\") pod \"glance-db-create-lqdnd\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 
crc kubenswrapper[4782]: I0130 18:48:00.424104 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rkfpr"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.425067 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.441329 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5421-account-create-update-5nkvt"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.442561 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.442770 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-combined-ca-bundle\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.442815 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-config-data\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.442839 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8kws\" (UniqueName: \"kubernetes.io/projected/5aa667b9-2c1d-4219-9f23-666323d7f509-kube-api-access-g8kws\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.442867 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f488b46-84ff-405e-abbe-87d54f15a5f5-operator-scripts\") pod \"glance-cac0-account-create-update-khnrr\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.442901 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-db-sync-config-data\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.442941 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwkzg\" (UniqueName: \"kubernetes.io/projected/0f488b46-84ff-405e-abbe-87d54f15a5f5-kube-api-access-pwkzg\") pod \"glance-cac0-account-create-update-khnrr\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.446427 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.446472 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-combined-ca-bundle\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.447812 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-db-sync-config-data\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.451678 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.463047 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rkfpr"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.469534 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-config-data\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.471828 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5421-account-create-update-5nkvt"] Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.472861 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8kws\" (UniqueName: \"kubernetes.io/projected/5aa667b9-2c1d-4219-9f23-666323d7f509-kube-api-access-g8kws\") pod \"watcher-db-sync-lnnb7\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.494129 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.543953 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f488b46-84ff-405e-abbe-87d54f15a5f5-operator-scripts\") pod \"glance-cac0-account-create-update-khnrr\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.544034 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxfzr\" (UniqueName: \"kubernetes.io/projected/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-kube-api-access-lxfzr\") pod \"neutron-5421-account-create-update-5nkvt\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.544079 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfgll\" (UniqueName: \"kubernetes.io/projected/69d2e1f3-f06f-44b0-81a0-9b1117aea096-kube-api-access-xfgll\") pod \"neutron-db-create-rkfpr\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.544109 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d2e1f3-f06f-44b0-81a0-9b1117aea096-operator-scripts\") pod \"neutron-db-create-rkfpr\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.544137 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwkzg\" (UniqueName: \"kubernetes.io/projected/0f488b46-84ff-405e-abbe-87d54f15a5f5-kube-api-access-pwkzg\") pod \"glance-cac0-account-create-update-khnrr\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.544183 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-operator-scripts\") pod \"neutron-5421-account-create-update-5nkvt\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.544954 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f488b46-84ff-405e-abbe-87d54f15a5f5-operator-scripts\") pod \"glance-cac0-account-create-update-khnrr\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.582223 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwkzg\" (UniqueName: \"kubernetes.io/projected/0f488b46-84ff-405e-abbe-87d54f15a5f5-kube-api-access-pwkzg\") pod \"glance-cac0-account-create-update-khnrr\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.630598 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.646127 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfzr\" (UniqueName: \"kubernetes.io/projected/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-kube-api-access-lxfzr\") pod \"neutron-5421-account-create-update-5nkvt\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.646438 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfgll\" (UniqueName: \"kubernetes.io/projected/69d2e1f3-f06f-44b0-81a0-9b1117aea096-kube-api-access-xfgll\") pod \"neutron-db-create-rkfpr\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.646458 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d2e1f3-f06f-44b0-81a0-9b1117aea096-operator-scripts\") pod \"neutron-db-create-rkfpr\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.646515 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-operator-scripts\") pod \"neutron-5421-account-create-update-5nkvt\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.647202 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-operator-scripts\") pod \"neutron-5421-account-create-update-5nkvt\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.647667 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d2e1f3-f06f-44b0-81a0-9b1117aea096-operator-scripts\") pod \"neutron-db-create-rkfpr\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.673317 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxfzr\" (UniqueName: \"kubernetes.io/projected/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-kube-api-access-lxfzr\") pod \"neutron-5421-account-create-update-5nkvt\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.687771 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfgll\" (UniqueName: \"kubernetes.io/projected/69d2e1f3-f06f-44b0-81a0-9b1117aea096-kube-api-access-xfgll\") pod \"neutron-db-create-rkfpr\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.768223 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.794626 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.925965 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"2471b5ebabcfdef6f6cec6a08d7d3ccd40fc99179a5416cfc38d853fb135749d"} Jan 30 18:48:00 crc kubenswrapper[4782]: I0130 18:48:00.926221 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"1d195ab2ebdc1c2982d23915ea5b8caf83b0ed07c2104864c0bf83aa7e3bdfc7"} Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.115582 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lqdnd"] Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.446822 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-48gwt" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.472664 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.499662 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.507617 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vddgn" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.566687 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxvbj\" (UniqueName: \"kubernetes.io/projected/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-kube-api-access-vxvbj\") pod \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\" (UID: \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.566726 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-operator-scripts\") pod \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.566803 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98szt\" (UniqueName: \"kubernetes.io/projected/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-kube-api-access-98szt\") pod \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\" (UID: \"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.566827 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7a692-b10f-4799-bae7-8e2d88d26d22-operator-scripts\") pod \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.566893 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmjvv\" (UniqueName: \"kubernetes.io/projected/d7c7a692-b10f-4799-bae7-8e2d88d26d22-kube-api-access-lmjvv\") pod 
\"d7c7a692-b10f-4799-bae7-8e2d88d26d22\" (UID: \"d7c7a692-b10f-4799-bae7-8e2d88d26d22\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.566954 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-operator-scripts\") pod \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\" (UID: \"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.567950 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee" (UID: "dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.569149 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c7a692-b10f-4799-bae7-8e2d88d26d22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7c7a692-b10f-4799-bae7-8e2d88d26d22" (UID: "d7c7a692-b10f-4799-bae7-8e2d88d26d22"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.569856 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe1d4b19-02b2-4c3b-9e1a-eb9505b41503" (UID: "fe1d4b19-02b2-4c3b-9e1a-eb9505b41503"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.573533 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c7a692-b10f-4799-bae7-8e2d88d26d22-kube-api-access-lmjvv" (OuterVolumeSpecName: "kube-api-access-lmjvv") pod "d7c7a692-b10f-4799-bae7-8e2d88d26d22" (UID: "d7c7a692-b10f-4799-bae7-8e2d88d26d22"). InnerVolumeSpecName "kube-api-access-lmjvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.573656 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-kube-api-access-vxvbj" (OuterVolumeSpecName: "kube-api-access-vxvbj") pod "dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee" (UID: "dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee"). InnerVolumeSpecName "kube-api-access-vxvbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.573878 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-kube-api-access-98szt" (OuterVolumeSpecName: "kube-api-access-98szt") pod "fe1d4b19-02b2-4c3b-9e1a-eb9505b41503" (UID: "fe1d4b19-02b2-4c3b-9e1a-eb9505b41503"). InnerVolumeSpecName "kube-api-access-98szt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.578562 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rkfpr"] Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.587741 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5421-account-create-update-5nkvt"] Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.595454 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-cac0-account-create-update-khnrr"] Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.601826 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-lnnb7"] Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.668419 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vth5p\" (UniqueName: \"kubernetes.io/projected/8bcdb105-a248-440d-89b4-8c4ad0420ecc-kube-api-access-vth5p\") pod \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.668535 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcdb105-a248-440d-89b4-8c4ad0420ecc-operator-scripts\") pod \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\" (UID: \"8bcdb105-a248-440d-89b4-8c4ad0420ecc\") " Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.669057 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.669073 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxvbj\" (UniqueName: \"kubernetes.io/projected/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee-kube-api-access-vxvbj\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.669086 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.669095 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98szt\" (UniqueName: \"kubernetes.io/projected/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503-kube-api-access-98szt\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.669106 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7a692-b10f-4799-bae7-8e2d88d26d22-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.669116 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmjvv\" (UniqueName: \"kubernetes.io/projected/d7c7a692-b10f-4799-bae7-8e2d88d26d22-kube-api-access-lmjvv\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.669275 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcdb105-a248-440d-89b4-8c4ad0420ecc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bcdb105-a248-440d-89b4-8c4ad0420ecc" (UID: "8bcdb105-a248-440d-89b4-8c4ad0420ecc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.673182 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcdb105-a248-440d-89b4-8c4ad0420ecc-kube-api-access-vth5p" (OuterVolumeSpecName: "kube-api-access-vth5p") pod "8bcdb105-a248-440d-89b4-8c4ad0420ecc" (UID: "8bcdb105-a248-440d-89b4-8c4ad0420ecc"). InnerVolumeSpecName "kube-api-access-vth5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.770320 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vth5p\" (UniqueName: \"kubernetes.io/projected/8bcdb105-a248-440d-89b4-8c4ad0420ecc-kube-api-access-vth5p\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.770358 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcdb105-a248-440d-89b4-8c4ad0420ecc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.936155 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cff8-account-create-update-dnxld" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.936149 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cff8-account-create-update-dnxld" event={"ID":"dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee","Type":"ContainerDied","Data":"414e281ac73c56025b5de627f16f8d96eb509a13eb7f162526e998db86550696"} Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.937762 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="414e281ac73c56025b5de627f16f8d96eb509a13eb7f162526e998db86550696" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.939084 4782 generic.go:334] "Generic (PLEG): container finished" podID="a8c361db-acc1-4cf3-aff9-f66661a2e327" containerID="5c7bdfb1bdb95d8fd0d941e12ab706c53dab9816c94d120ff51705f269413965" exitCode=0 Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.939133 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lqdnd" event={"ID":"a8c361db-acc1-4cf3-aff9-f66661a2e327","Type":"ContainerDied","Data":"5c7bdfb1bdb95d8fd0d941e12ab706c53dab9816c94d120ff51705f269413965"} Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.939151 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lqdnd" event={"ID":"a8c361db-acc1-4cf3-aff9-f66661a2e327","Type":"ContainerStarted","Data":"1fb18d19877b4150b3b80f8e79fd600f54391e38a222a7d27c9275c886ed46df"} Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.946081 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-48gwt" event={"ID":"fe1d4b19-02b2-4c3b-9e1a-eb9505b41503","Type":"ContainerDied","Data":"114794ffcf14f771822d4debb1cf829a9b9b9a6e256060d088d271cd8c2f9efa"} Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.946150 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114794ffcf14f771822d4debb1cf829a9b9b9a6e256060d088d271cd8c2f9efa" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.946263 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-48gwt" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.952659 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-430c-account-create-update-snr88" event={"ID":"d7c7a692-b10f-4799-bae7-8e2d88d26d22","Type":"ContainerDied","Data":"4b0425718ed3ab0164b51ced5546038f97c64859eae4f6254f5d7aa1f86aa8f7"} Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.952701 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b0425718ed3ab0164b51ced5546038f97c64859eae4f6254f5d7aa1f86aa8f7" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.952784 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-430c-account-create-update-snr88" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.974731 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vddgn" event={"ID":"8bcdb105-a248-440d-89b4-8c4ad0420ecc","Type":"ContainerDied","Data":"739e43083308a403e75f9b868076cbc8d6c539c23a465bbe298b9e8f9cb5f6eb"} Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.974768 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="739e43083308a403e75f9b868076cbc8d6c539c23a465bbe298b9e8f9cb5f6eb" Jan 30 18:48:01 crc kubenswrapper[4782]: I0130 18:48:01.974835 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vddgn" Jan 30 18:48:02 crc kubenswrapper[4782]: W0130 18:48:02.060887 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ce13f8f_f62e_426f_b59b_6a15cb3237bb.slice/crio-5ca7cd2289a4704abdb21290d90e5d97dd91e7a3103e12b1a5f60ab5c11862e2 WatchSource:0}: Error finding container 5ca7cd2289a4704abdb21290d90e5d97dd91e7a3103e12b1a5f60ab5c11862e2: Status 404 returned error can't find the container with id 5ca7cd2289a4704abdb21290d90e5d97dd91e7a3103e12b1a5f60ab5c11862e2 Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.988867 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"18e2d5468588b42302be1df133f791538df100e67a17d54de7e6d5f6b9b72f0b"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.989154 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"0b2261b04686e2884f5a57265c8a49e2286bc718ec4f44f9d67e25a32db00668"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.991264 4782 generic.go:334] "Generic (PLEG): container finished" podID="0f488b46-84ff-405e-abbe-87d54f15a5f5" containerID="e12bf6139dea373825462c63db019c3c88d79fb991c65b2d2912268e4bedd630" exitCode=0 Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.991303 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-cac0-account-create-update-khnrr" event={"ID":"0f488b46-84ff-405e-abbe-87d54f15a5f5","Type":"ContainerDied","Data":"e12bf6139dea373825462c63db019c3c88d79fb991c65b2d2912268e4bedd630"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.991467 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-cac0-account-create-update-khnrr" 
event={"ID":"0f488b46-84ff-405e-abbe-87d54f15a5f5","Type":"ContainerStarted","Data":"ea41b380d2e6e19b063fd0b9cd39645e44793f33a578485d6373bff112d3054c"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.994018 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-lnnb7" event={"ID":"5aa667b9-2c1d-4219-9f23-666323d7f509","Type":"ContainerStarted","Data":"7e2809ba214680e73e8f6eec482f15f6fe945528157eef45ae40cba0207cb0bf"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.995555 4782 generic.go:334] "Generic (PLEG): container finished" podID="7ce13f8f-f62e-426f-b59b-6a15cb3237bb" containerID="c62d4f14221ba5405a0d83b0d4b68cdf23119cd81602a31b59ea10e455587c48" exitCode=0 Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.995656 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5421-account-create-update-5nkvt" event={"ID":"7ce13f8f-f62e-426f-b59b-6a15cb3237bb","Type":"ContainerDied","Data":"c62d4f14221ba5405a0d83b0d4b68cdf23119cd81602a31b59ea10e455587c48"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.995679 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5421-account-create-update-5nkvt" event={"ID":"7ce13f8f-f62e-426f-b59b-6a15cb3237bb","Type":"ContainerStarted","Data":"5ca7cd2289a4704abdb21290d90e5d97dd91e7a3103e12b1a5f60ab5c11862e2"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.997419 4782 generic.go:334] "Generic (PLEG): container finished" podID="69d2e1f3-f06f-44b0-81a0-9b1117aea096" containerID="63f5072ad71d5e389523e43b61f20d7de63e0a51efa92955161b53c1927f0abf" exitCode=0 Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.997637 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rkfpr" event={"ID":"69d2e1f3-f06f-44b0-81a0-9b1117aea096","Type":"ContainerDied","Data":"63f5072ad71d5e389523e43b61f20d7de63e0a51efa92955161b53c1927f0abf"} Jan 30 18:48:02 crc kubenswrapper[4782]: I0130 18:48:02.997664 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rkfpr" event={"ID":"69d2e1f3-f06f-44b0-81a0-9b1117aea096","Type":"ContainerStarted","Data":"543f2bb396dcd6f91942cf33566a0438c98faacb09269dc20dc8c0043d865ed6"} Jan 30 18:48:03 crc kubenswrapper[4782]: I0130 18:48:03.437939 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:48:03 crc kubenswrapper[4782]: I0130 18:48:03.438616 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="config-reloader" containerID="cri-o://6340af41e50b610f37ae4f20cbcae1c5bf4ef053a6c7cc71c76a6438f00b6848" gracePeriod=600 Jan 30 18:48:03 crc kubenswrapper[4782]: I0130 18:48:03.438725 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="prometheus" containerID="cri-o://d53c1f8b8401bd826cb739193815637f91833604fce2905f3e4b49162f0da4ef" gracePeriod=600 Jan 30 18:48:03 crc kubenswrapper[4782]: I0130 18:48:03.438725 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="thanos-sidecar" containerID="cri-o://e060c5ad3fd1db961e0960e03d13a7b2f1c40090cda4dc28fea7a058dfb1eaa2" gracePeriod=600 Jan 30 18:48:04 crc kubenswrapper[4782]: I0130 18:48:04.008420 4782 
generic.go:334] "Generic (PLEG): container finished" podID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerID="d53c1f8b8401bd826cb739193815637f91833604fce2905f3e4b49162f0da4ef" exitCode=0 Jan 30 18:48:04 crc kubenswrapper[4782]: I0130 18:48:04.008449 4782 generic.go:334] "Generic (PLEG): container finished" podID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerID="e060c5ad3fd1db961e0960e03d13a7b2f1c40090cda4dc28fea7a058dfb1eaa2" exitCode=0 Jan 30 18:48:04 crc kubenswrapper[4782]: I0130 18:48:04.008456 4782 generic.go:334] "Generic (PLEG): container finished" podID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerID="6340af41e50b610f37ae4f20cbcae1c5bf4ef053a6c7cc71c76a6438f00b6848" exitCode=0 Jan 30 18:48:04 crc kubenswrapper[4782]: I0130 18:48:04.008588 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerDied","Data":"d53c1f8b8401bd826cb739193815637f91833604fce2905f3e4b49162f0da4ef"} Jan 30 18:48:04 crc kubenswrapper[4782]: I0130 18:48:04.008614 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerDied","Data":"e060c5ad3fd1db961e0960e03d13a7b2f1c40090cda4dc28fea7a058dfb1eaa2"} Jan 30 18:48:04 crc kubenswrapper[4782]: I0130 18:48:04.008624 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerDied","Data":"6340af41e50b610f37ae4f20cbcae1c5bf4ef053a6c7cc71c76a6438f00b6848"} Jan 30 18:48:04 crc kubenswrapper[4782]: I0130 18:48:04.530383 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": dial tcp 10.217.0.113:9090: connect: connection refused" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.530128 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": dial tcp 10.217.0.113:9090: connect: connection refused" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.850174 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.884895 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.886423 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.889559 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.955851 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxfzr\" (UniqueName: \"kubernetes.io/projected/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-kube-api-access-lxfzr\") pod \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.955943 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfgll\" (UniqueName: \"kubernetes.io/projected/69d2e1f3-f06f-44b0-81a0-9b1117aea096-kube-api-access-xfgll\") pod \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.955975 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c361db-acc1-4cf3-aff9-f66661a2e327-operator-scripts\") pod \"a8c361db-acc1-4cf3-aff9-f66661a2e327\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.956066 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d2e1f3-f06f-44b0-81a0-9b1117aea096-operator-scripts\") pod \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\" (UID: \"69d2e1f3-f06f-44b0-81a0-9b1117aea096\") " Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.956111 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzqgj\" (UniqueName: \"kubernetes.io/projected/a8c361db-acc1-4cf3-aff9-f66661a2e327-kube-api-access-rzqgj\") pod \"a8c361db-acc1-4cf3-aff9-f66661a2e327\" (UID: \"a8c361db-acc1-4cf3-aff9-f66661a2e327\") " Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.956146 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-operator-scripts\") pod \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\" (UID: \"7ce13f8f-f62e-426f-b59b-6a15cb3237bb\") " Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.956542 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c361db-acc1-4cf3-aff9-f66661a2e327-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8c361db-acc1-4cf3-aff9-f66661a2e327" (UID: "a8c361db-acc1-4cf3-aff9-f66661a2e327"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.956882 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ce13f8f-f62e-426f-b59b-6a15cb3237bb" (UID: "7ce13f8f-f62e-426f-b59b-6a15cb3237bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.957140 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d2e1f3-f06f-44b0-81a0-9b1117aea096-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69d2e1f3-f06f-44b0-81a0-9b1117aea096" (UID: "69d2e1f3-f06f-44b0-81a0-9b1117aea096"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.962094 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d2e1f3-f06f-44b0-81a0-9b1117aea096-kube-api-access-xfgll" (OuterVolumeSpecName: "kube-api-access-xfgll") pod "69d2e1f3-f06f-44b0-81a0-9b1117aea096" (UID: "69d2e1f3-f06f-44b0-81a0-9b1117aea096"). InnerVolumeSpecName "kube-api-access-xfgll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.962305 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-kube-api-access-lxfzr" (OuterVolumeSpecName: "kube-api-access-lxfzr") pod "7ce13f8f-f62e-426f-b59b-6a15cb3237bb" (UID: "7ce13f8f-f62e-426f-b59b-6a15cb3237bb"). InnerVolumeSpecName "kube-api-access-lxfzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:09 crc kubenswrapper[4782]: I0130 18:48:09.971484 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c361db-acc1-4cf3-aff9-f66661a2e327-kube-api-access-rzqgj" (OuterVolumeSpecName: "kube-api-access-rzqgj") pod "a8c361db-acc1-4cf3-aff9-f66661a2e327" (UID: "a8c361db-acc1-4cf3-aff9-f66661a2e327"). InnerVolumeSpecName "kube-api-access-rzqgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058051 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwkzg\" (UniqueName: \"kubernetes.io/projected/0f488b46-84ff-405e-abbe-87d54f15a5f5-kube-api-access-pwkzg\") pod \"0f488b46-84ff-405e-abbe-87d54f15a5f5\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058161 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f488b46-84ff-405e-abbe-87d54f15a5f5-operator-scripts\") pod \"0f488b46-84ff-405e-abbe-87d54f15a5f5\" (UID: \"0f488b46-84ff-405e-abbe-87d54f15a5f5\") " Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058597 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxfzr\" (UniqueName: \"kubernetes.io/projected/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-kube-api-access-lxfzr\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058619 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfgll\" (UniqueName: \"kubernetes.io/projected/69d2e1f3-f06f-44b0-81a0-9b1117aea096-kube-api-access-xfgll\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058629 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c361db-acc1-4cf3-aff9-f66661a2e327-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058638 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69d2e1f3-f06f-44b0-81a0-9b1117aea096-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058647 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzqgj\" (UniqueName: \"kubernetes.io/projected/a8c361db-acc1-4cf3-aff9-f66661a2e327-kube-api-access-rzqgj\") on node \"crc\" DevicePath 
\"\"" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.058657 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ce13f8f-f62e-426f-b59b-6a15cb3237bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.059148 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f488b46-84ff-405e-abbe-87d54f15a5f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f488b46-84ff-405e-abbe-87d54f15a5f5" (UID: "0f488b46-84ff-405e-abbe-87d54f15a5f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.059746 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rkfpr" event={"ID":"69d2e1f3-f06f-44b0-81a0-9b1117aea096","Type":"ContainerDied","Data":"543f2bb396dcd6f91942cf33566a0438c98faacb09269dc20dc8c0043d865ed6"} Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.059782 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="543f2bb396dcd6f91942cf33566a0438c98faacb09269dc20dc8c0043d865ed6" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.059810 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rkfpr" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.061641 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-cac0-account-create-update-khnrr" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.061666 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-cac0-account-create-update-khnrr" event={"ID":"0f488b46-84ff-405e-abbe-87d54f15a5f5","Type":"ContainerDied","Data":"ea41b380d2e6e19b063fd0b9cd39645e44793f33a578485d6373bff112d3054c"} Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.061703 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea41b380d2e6e19b063fd0b9cd39645e44793f33a578485d6373bff112d3054c" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.062125 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f488b46-84ff-405e-abbe-87d54f15a5f5-kube-api-access-pwkzg" (OuterVolumeSpecName: "kube-api-access-pwkzg") pod "0f488b46-84ff-405e-abbe-87d54f15a5f5" (UID: "0f488b46-84ff-405e-abbe-87d54f15a5f5"). InnerVolumeSpecName "kube-api-access-pwkzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.063788 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lqdnd" event={"ID":"a8c361db-acc1-4cf3-aff9-f66661a2e327","Type":"ContainerDied","Data":"1fb18d19877b4150b3b80f8e79fd600f54391e38a222a7d27c9275c886ed46df"} Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.063826 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fb18d19877b4150b3b80f8e79fd600f54391e38a222a7d27c9275c886ed46df" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.063834 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lqdnd" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.065567 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5421-account-create-update-5nkvt" event={"ID":"7ce13f8f-f62e-426f-b59b-6a15cb3237bb","Type":"ContainerDied","Data":"5ca7cd2289a4704abdb21290d90e5d97dd91e7a3103e12b1a5f60ab5c11862e2"} Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.065603 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca7cd2289a4704abdb21290d90e5d97dd91e7a3103e12b1a5f60ab5c11862e2" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.065621 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5421-account-create-update-5nkvt" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.159963 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwkzg\" (UniqueName: \"kubernetes.io/projected/0f488b46-84ff-405e-abbe-87d54f15a5f5-kube-api-access-pwkzg\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:10 crc kubenswrapper[4782]: I0130 18:48:10.159991 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f488b46-84ff-405e-abbe-87d54f15a5f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.151446 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317260 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config-out\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317343 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-2\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317377 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdmb7\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-kube-api-access-cdmb7\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317417 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-thanos-prometheus-http-client-file\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317459 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-1\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317541 4782 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-tls-assets\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317571 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-web-config\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317751 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317801 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317822 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-0\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.317966 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config\") pod \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\" (UID: \"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c\") " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.318050 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.318272 4782 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.318284 4782 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.318559 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.324125 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config-out" (OuterVolumeSpecName: "config-out") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.324594 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-kube-api-access-cdmb7" (OuterVolumeSpecName: "kube-api-access-cdmb7") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "kube-api-access-cdmb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.334423 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.338253 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.340535 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config" (OuterVolumeSpecName: "config") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.342543 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "pvc-10319900-5721-45e2-9485-947fcfd18ab3". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.360416 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-web-config" (OuterVolumeSpecName: "web-config") pod "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" (UID: "373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.419288 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.419644 4782 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-config-out\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.419697 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdmb7\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-kube-api-access-cdmb7\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.420089 4782 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.420105 4782 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.420117 4782 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-web-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.420142 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") on node \"crc\" " Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.420155 4782 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.442518 4782 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.442698 4782 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-10319900-5721-45e2-9485-947fcfd18ab3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3") on node "crc" Jan 30 18:48:11 crc kubenswrapper[4782]: I0130 18:48:11.521497 4782 reconciler_common.go:293] "Volume detached for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.099546 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c","Type":"ContainerDied","Data":"370131147ef6e64d8cc01a054b6ba833450a2395f218959eb61e3d72ef39c46d"} Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.099628 4782 scope.go:117] "RemoveContainer" containerID="d53c1f8b8401bd826cb739193815637f91833604fce2905f3e4b49162f0da4ef" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.099725 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.145524 4782 scope.go:117] "RemoveContainer" containerID="e060c5ad3fd1db961e0960e03d13a7b2f1c40090cda4dc28fea7a058dfb1eaa2" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.176208 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.181805 4782 scope.go:117] "RemoveContainer" containerID="6340af41e50b610f37ae4f20cbcae1c5bf4ef053a6c7cc71c76a6438f00b6848" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.190441 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.210351 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.210932 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f488b46-84ff-405e-abbe-87d54f15a5f5" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.211006 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f488b46-84ff-405e-abbe-87d54f15a5f5" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.211095 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcdb105-a248-440d-89b4-8c4ad0420ecc" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.211170 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcdb105-a248-440d-89b4-8c4ad0420ecc" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.211400 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1d4b19-02b2-4c3b-9e1a-eb9505b41503" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.211494 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1d4b19-02b2-4c3b-9e1a-eb9505b41503" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.211564 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" 
containerName="prometheus" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.211646 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="prometheus" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.211718 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d2e1f3-f06f-44b0-81a0-9b1117aea096" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.211788 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d2e1f3-f06f-44b0-81a0-9b1117aea096" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.211849 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="init-config-reloader" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.211913 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="init-config-reloader" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.211982 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce13f8f-f62e-426f-b59b-6a15cb3237bb" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.212045 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce13f8f-f62e-426f-b59b-6a15cb3237bb" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.212113 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.212176 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.212273 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="config-reloader" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.212344 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="config-reloader" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.212411 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c7a692-b10f-4799-bae7-8e2d88d26d22" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.212467 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c7a692-b10f-4799-bae7-8e2d88d26d22" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.212522 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="thanos-sidecar" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.219105 4782 scope.go:117] "RemoveContainer" containerID="e8e79982a7d959454bc43b52f926e3a06b057e89684c1a809e35d6e8fa448449" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.219128 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="thanos-sidecar" Jan 30 18:48:12 crc kubenswrapper[4782]: E0130 18:48:12.219327 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c361db-acc1-4cf3-aff9-f66661a2e327" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 
18:48:12.219355 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c361db-acc1-4cf3-aff9-f66661a2e327" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223531 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223597 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c7a692-b10f-4799-bae7-8e2d88d26d22" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223617 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d2e1f3-f06f-44b0-81a0-9b1117aea096" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223642 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce13f8f-f62e-426f-b59b-6a15cb3237bb" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223660 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c361db-acc1-4cf3-aff9-f66661a2e327" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223707 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="config-reloader" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223722 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f488b46-84ff-405e-abbe-87d54f15a5f5" containerName="mariadb-account-create-update" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223741 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="prometheus" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223751 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bcdb105-a248-440d-89b4-8c4ad0420ecc" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223765 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1d4b19-02b2-4c3b-9e1a-eb9505b41503" containerName="mariadb-database-create" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.223787 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" containerName="thanos-sidecar" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.234666 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.239473 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.260795 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.261087 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.261252 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.261562 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.261818 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.261939 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.261976 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.262785 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4h9n8" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.265289 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.336835 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337118 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337261 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337626 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337679 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337788 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpd96\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-kube-api-access-lpd96\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337846 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337881 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337914 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.337942 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.338028 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.338096 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.338123 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.420646 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c" path="/var/lib/kubelet/pods/373ccd1c-b7bf-4ba1-92c2-37d1ca11e45c/volumes" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439297 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439353 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439395 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpd96\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-kube-api-access-lpd96\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439421 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439442 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439465 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439481 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: 
\"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439498 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439520 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439536 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439583 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439613 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.439640 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.440516 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.440932 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.441128 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.443637 4782 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.443707 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7160aa072a4d7b723e1d7d729bf0a8cb68e388f5eb2d179074e77609eba87da8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.445745 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.446002 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.447045 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.447474 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.449211 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.451596 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config-out\") pod \"prometheus-metric-storage-0\" 
(UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.452082 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.453399 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.458646 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpd96\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-kube-api-access-lpd96\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.477020 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:12 crc kubenswrapper[4782]: I0130 18:48:12.604550 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 18:48:13 crc kubenswrapper[4782]: I0130 18:48:13.111872 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"6022b49904f8d446a39a1b79b1806c0ebfa3820d33cff4c650f24c98fff503f1"} Jan 30 18:48:13 crc kubenswrapper[4782]: I0130 18:48:13.121327 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4cx2" event={"ID":"16e94436-5c08-4fa9-8e93-8929251269ff","Type":"ContainerStarted","Data":"d4d72ca3e929937c0055d22ab8a6d716ebb4e8e358a6caa2a13cec5ae25e6262"} Jan 30 18:48:13 crc kubenswrapper[4782]: I0130 18:48:13.148068 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-c4cx2" podStartSLOduration=4.022697115 podStartE2EDuration="16.148046949s" podCreationTimestamp="2026-01-30 18:47:57 +0000 UTC" firstStartedPulling="2026-01-30 18:47:59.228096135 +0000 UTC m=+1055.496474160" lastFinishedPulling="2026-01-30 18:48:11.353445959 +0000 UTC m=+1067.621823994" observedRunningTime="2026-01-30 18:48:13.144370788 +0000 UTC m=+1069.412748813" watchObservedRunningTime="2026-01-30 18:48:13.148046949 +0000 UTC m=+1069.416424964" Jan 30 18:48:13 crc kubenswrapper[4782]: I0130 18:48:13.837728 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 18:48:14 crc kubenswrapper[4782]: I0130 18:48:14.137758 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-lnnb7" event={"ID":"5aa667b9-2c1d-4219-9f23-666323d7f509","Type":"ContainerStarted","Data":"eef16c5dad318cdaffe998186a3495b644a699b58c83f4c776a66d1dbd763684"} Jan 30 18:48:14 crc 
kubenswrapper[4782]: I0130 18:48:14.155377 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"6307f1a35efd91ef5aa8c95f931ac32bf49ab1727884857e3de01d33131c2982"} Jan 30 18:48:14 crc kubenswrapper[4782]: I0130 18:48:14.155418 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"7120b408da5042a179dc19f20ec68a5d68e1991554ebeec68121dabfbbb9cc06"} Jan 30 18:48:14 crc kubenswrapper[4782]: I0130 18:48:14.158681 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerStarted","Data":"97bfae0737ec94e8ed8db19019eb512491cf7b03968d31c7b7717785d3817977"} Jan 30 18:48:14 crc kubenswrapper[4782]: I0130 18:48:14.166534 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-lnnb7" podStartSLOduration=3.275796443 podStartE2EDuration="14.166507851s" podCreationTimestamp="2026-01-30 18:48:00 +0000 UTC" firstStartedPulling="2026-01-30 18:48:02.066074713 +0000 UTC m=+1058.334452738" lastFinishedPulling="2026-01-30 18:48:12.956786081 +0000 UTC m=+1069.225164146" observedRunningTime="2026-01-30 18:48:14.159048206 +0000 UTC m=+1070.427426261" watchObservedRunningTime="2026-01-30 18:48:14.166507851 +0000 UTC m=+1070.434885906" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.172319 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"84a676a80f35103543e7cd38ac879bb101d993c2037a66c65d7eda1a2039a53d"} Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.172586 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"4094acc1e694970d27e64d65e923f071b156a9aad3557ab2c19f2d0dce40aee5"} Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.172596 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"a1f868043cea96273929071a974bfb09c9e85e15701d895cdae0b23e550dc191"} Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.172604 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"03578ac049dd86010f19461f5b1d1d7324324f53e468ff439ab7f5eebfa9147f"} Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.326898 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-bpwp5"] Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.328030 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.334318 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.334577 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwwlx" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.342611 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bpwp5"] Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.433073 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-combined-ca-bundle\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.433154 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-config-data\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.433277 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-db-sync-config-data\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.433323 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7f88\" (UniqueName: \"kubernetes.io/projected/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-kube-api-access-b7f88\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.534714 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7f88\" (UniqueName: \"kubernetes.io/projected/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-kube-api-access-b7f88\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.534792 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-combined-ca-bundle\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.534826 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-config-data\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.535151 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-db-sync-config-data\") pod 
\"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.575751 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-db-sync-config-data\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.576969 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-config-data\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.577464 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7f88\" (UniqueName: \"kubernetes.io/projected/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-kube-api-access-b7f88\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.579376 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-combined-ca-bundle\") pod \"glance-db-sync-bpwp5\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:15 crc kubenswrapper[4782]: I0130 18:48:15.646857 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bpwp5" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.185676 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"eb1c727313c52c45d1d0e828f983d0fde01440bbae90c710c289a94971e24a61"} Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.186002 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15ef1358-db2b-4935-b53c-7aad2613cee7","Type":"ContainerStarted","Data":"ea3bb1054d283435bac53b13abab8a7d7450a26a40e4126016dd4531f2ee4aec"} Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.232333 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.029769524 podStartE2EDuration="53.232291868s" podCreationTimestamp="2026-01-30 18:47:23 +0000 UTC" firstStartedPulling="2026-01-30 18:47:57.591416537 +0000 UTC m=+1053.859794562" lastFinishedPulling="2026-01-30 18:48:13.793938891 +0000 UTC m=+1070.062316906" observedRunningTime="2026-01-30 18:48:16.230292669 +0000 UTC m=+1072.498670724" watchObservedRunningTime="2026-01-30 18:48:16.232291868 +0000 UTC m=+1072.500669893" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.367305 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bpwp5"] Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.542559 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658798c9cc-hdk8g"] Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.544708 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.547000 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.552318 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658798c9cc-hdk8g"] Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.664923 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-nb\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.664996 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-svc\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.665030 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-sb\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.665067 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-config\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.665093 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg7js\" (UniqueName: \"kubernetes.io/projected/ac812687-eb03-4784-ad8f-ff91ebd39d28-kube-api-access-jg7js\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.665493 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-swift-storage-0\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.768787 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-swift-storage-0\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.768864 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-nb\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: 
\"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.768909 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-svc\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.768947 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-sb\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.768996 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-config\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.769020 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7js\" (UniqueName: \"kubernetes.io/projected/ac812687-eb03-4784-ad8f-ff91ebd39d28-kube-api-access-jg7js\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.769968 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-nb\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.770157 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-svc\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.770196 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-swift-storage-0\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.770536 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-sb\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.770666 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-config\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: 
I0130 18:48:16.798924 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7js\" (UniqueName: \"kubernetes.io/projected/ac812687-eb03-4784-ad8f-ff91ebd39d28-kube-api-access-jg7js\") pod \"dnsmasq-dns-658798c9cc-hdk8g\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:16 crc kubenswrapper[4782]: I0130 18:48:16.862634 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:17 crc kubenswrapper[4782]: I0130 18:48:17.227210 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bpwp5" event={"ID":"07adaf47-0b0c-46f9-bf42-fc02ffec87a4","Type":"ContainerStarted","Data":"11aad5f2e222830f361eb66056d34b6dcbb23a1114d60f97f0eb522e07453b61"} Jan 30 18:48:17 crc kubenswrapper[4782]: I0130 18:48:17.368972 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658798c9cc-hdk8g"] Jan 30 18:48:17 crc kubenswrapper[4782]: W0130 18:48:17.389021 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac812687_eb03_4784_ad8f_ff91ebd39d28.slice/crio-a33d4478d98375bfc760061cc0d9d00f80bf717f04af79808219696e7ae6cfce WatchSource:0}: Error finding container a33d4478d98375bfc760061cc0d9d00f80bf717f04af79808219696e7ae6cfce: Status 404 returned error can't find the container with id a33d4478d98375bfc760061cc0d9d00f80bf717f04af79808219696e7ae6cfce Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.239296 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerStarted","Data":"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555"} Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.245830 4782 generic.go:334] "Generic (PLEG): container finished" podID="ac812687-eb03-4784-ad8f-ff91ebd39d28" containerID="440897237abf64b329282b2bedba4af81f1f7437554cfd73f1472646b45046ee" exitCode=0 Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.245921 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" event={"ID":"ac812687-eb03-4784-ad8f-ff91ebd39d28","Type":"ContainerDied","Data":"440897237abf64b329282b2bedba4af81f1f7437554cfd73f1472646b45046ee"} Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.245977 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" event={"ID":"ac812687-eb03-4784-ad8f-ff91ebd39d28","Type":"ContainerStarted","Data":"a33d4478d98375bfc760061cc0d9d00f80bf717f04af79808219696e7ae6cfce"} Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.247814 4782 generic.go:334] "Generic (PLEG): container finished" podID="5aa667b9-2c1d-4219-9f23-666323d7f509" containerID="eef16c5dad318cdaffe998186a3495b644a699b58c83f4c776a66d1dbd763684" exitCode=0 Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.247931 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-lnnb7" event={"ID":"5aa667b9-2c1d-4219-9f23-666323d7f509","Type":"ContainerDied","Data":"eef16c5dad318cdaffe998186a3495b644a699b58c83f4c776a66d1dbd763684"} Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.254633 4782 generic.go:334] "Generic (PLEG): container finished" podID="16e94436-5c08-4fa9-8e93-8929251269ff" containerID="d4d72ca3e929937c0055d22ab8a6d716ebb4e8e358a6caa2a13cec5ae25e6262" 
exitCode=0 Jan 30 18:48:18 crc kubenswrapper[4782]: I0130 18:48:18.254673 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4cx2" event={"ID":"16e94436-5c08-4fa9-8e93-8929251269ff","Type":"ContainerDied","Data":"d4d72ca3e929937c0055d22ab8a6d716ebb4e8e358a6caa2a13cec5ae25e6262"} Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.284646 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" event={"ID":"ac812687-eb03-4784-ad8f-ff91ebd39d28","Type":"ContainerStarted","Data":"97bbefef290e953f0a9f2ceb37abae11f699d1bfdf4438a148cb27510245ffca"} Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.312431 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" podStartSLOduration=3.312406485 podStartE2EDuration="3.312406485s" podCreationTimestamp="2026-01-30 18:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:48:19.306052798 +0000 UTC m=+1075.574430833" watchObservedRunningTime="2026-01-30 18:48:19.312406485 +0000 UTC m=+1075.580784520" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.719370 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.724877 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.828289 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-combined-ca-bundle\") pod \"5aa667b9-2c1d-4219-9f23-666323d7f509\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.828399 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8kws\" (UniqueName: \"kubernetes.io/projected/5aa667b9-2c1d-4219-9f23-666323d7f509-kube-api-access-g8kws\") pod \"5aa667b9-2c1d-4219-9f23-666323d7f509\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.828437 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-combined-ca-bundle\") pod \"16e94436-5c08-4fa9-8e93-8929251269ff\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.828466 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-config-data\") pod \"5aa667b9-2c1d-4219-9f23-666323d7f509\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.828499 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-config-data\") pod \"16e94436-5c08-4fa9-8e93-8929251269ff\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.828619 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blrl4\" (UniqueName: 
\"kubernetes.io/projected/16e94436-5c08-4fa9-8e93-8929251269ff-kube-api-access-blrl4\") pod \"16e94436-5c08-4fa9-8e93-8929251269ff\" (UID: \"16e94436-5c08-4fa9-8e93-8929251269ff\") " Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.828671 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-db-sync-config-data\") pod \"5aa667b9-2c1d-4219-9f23-666323d7f509\" (UID: \"5aa667b9-2c1d-4219-9f23-666323d7f509\") " Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.834463 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e94436-5c08-4fa9-8e93-8929251269ff-kube-api-access-blrl4" (OuterVolumeSpecName: "kube-api-access-blrl4") pod "16e94436-5c08-4fa9-8e93-8929251269ff" (UID: "16e94436-5c08-4fa9-8e93-8929251269ff"). InnerVolumeSpecName "kube-api-access-blrl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.834608 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa667b9-2c1d-4219-9f23-666323d7f509-kube-api-access-g8kws" (OuterVolumeSpecName: "kube-api-access-g8kws") pod "5aa667b9-2c1d-4219-9f23-666323d7f509" (UID: "5aa667b9-2c1d-4219-9f23-666323d7f509"). InnerVolumeSpecName "kube-api-access-g8kws". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.837922 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5aa667b9-2c1d-4219-9f23-666323d7f509" (UID: "5aa667b9-2c1d-4219-9f23-666323d7f509"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.860334 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5aa667b9-2c1d-4219-9f23-666323d7f509" (UID: "5aa667b9-2c1d-4219-9f23-666323d7f509"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.860377 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16e94436-5c08-4fa9-8e93-8929251269ff" (UID: "16e94436-5c08-4fa9-8e93-8929251269ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.879090 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-config-data" (OuterVolumeSpecName: "config-data") pod "16e94436-5c08-4fa9-8e93-8929251269ff" (UID: "16e94436-5c08-4fa9-8e93-8929251269ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.889477 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-config-data" (OuterVolumeSpecName: "config-data") pod "5aa667b9-2c1d-4219-9f23-666323d7f509" (UID: "5aa667b9-2c1d-4219-9f23-666323d7f509"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.931959 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blrl4\" (UniqueName: \"kubernetes.io/projected/16e94436-5c08-4fa9-8e93-8929251269ff-kube-api-access-blrl4\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.931996 4782 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.932005 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.932014 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8kws\" (UniqueName: \"kubernetes.io/projected/5aa667b9-2c1d-4219-9f23-666323d7f509-kube-api-access-g8kws\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.932023 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.932034 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aa667b9-2c1d-4219-9f23-666323d7f509-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:19 crc kubenswrapper[4782]: I0130 18:48:19.932044 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e94436-5c08-4fa9-8e93-8929251269ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.302769 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-lnnb7" event={"ID":"5aa667b9-2c1d-4219-9f23-666323d7f509","Type":"ContainerDied","Data":"7e2809ba214680e73e8f6eec482f15f6fe945528157eef45ae40cba0207cb0bf"} Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.303117 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e2809ba214680e73e8f6eec482f15f6fe945528157eef45ae40cba0207cb0bf" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.302834 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-lnnb7" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.305841 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c4cx2" event={"ID":"16e94436-5c08-4fa9-8e93-8929251269ff","Type":"ContainerDied","Data":"c91f712f873dfc4e351eff0bb2ea52df3a47a9df198e920ea7c3bf5c06a6e948"} Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.305907 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91f712f873dfc4e351eff0bb2ea52df3a47a9df198e920ea7c3bf5c06a6e948" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.306061 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c4cx2" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.306068 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.581342 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658798c9cc-hdk8g"] Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.605503 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bf5f74b79-dx87h"] Jan 30 18:48:20 crc kubenswrapper[4782]: E0130 18:48:20.605819 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e94436-5c08-4fa9-8e93-8929251269ff" containerName="keystone-db-sync" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.605834 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e94436-5c08-4fa9-8e93-8929251269ff" containerName="keystone-db-sync" Jan 30 18:48:20 crc kubenswrapper[4782]: E0130 18:48:20.605860 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa667b9-2c1d-4219-9f23-666323d7f509" containerName="watcher-db-sync" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.605867 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa667b9-2c1d-4219-9f23-666323d7f509" containerName="watcher-db-sync" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.606034 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aa667b9-2c1d-4219-9f23-666323d7f509" containerName="watcher-db-sync" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.606051 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e94436-5c08-4fa9-8e93-8929251269ff" containerName="keystone-db-sync" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.606839 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.631281 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nfrgc"] Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.633166 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.639139 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.642521 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.643492 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.644050 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.644112 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7d8px" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650565 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-config-data\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650623 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-scripts\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650644 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn69f\" (UniqueName: \"kubernetes.io/projected/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-kube-api-access-bn69f\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650679 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-svc\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650712 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-config\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650729 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-sb\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650745 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-swift-storage-0\") pod 
\"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650766 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-credential-keys\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650791 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-nb\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650813 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-fernet-keys\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650834 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-combined-ca-bundle\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650864 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcpcd\" (UniqueName: \"kubernetes.io/projected/ec7672b9-b70c-4a7e-bda0-e880b0b57555-kube-api-access-fcpcd\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.650983 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nfrgc"] Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751459 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn69f\" (UniqueName: \"kubernetes.io/projected/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-kube-api-access-bn69f\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751497 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-scripts\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751528 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-svc\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751560 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-config\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751577 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-sb\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751596 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-swift-storage-0\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751625 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-credential-keys\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751647 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-nb\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751666 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-fernet-keys\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751687 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-combined-ca-bundle\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751725 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcpcd\" (UniqueName: \"kubernetes.io/projected/ec7672b9-b70c-4a7e-bda0-e880b0b57555-kube-api-access-fcpcd\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.751754 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-config-data\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.753335 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-swift-storage-0\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.756260 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-config\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.756959 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-svc\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.757678 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-sb\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.758354 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-nb\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.759477 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-scripts\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.762962 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-combined-ca-bundle\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.765718 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-credential-keys\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.766594 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-config-data\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.780770 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-fernet-keys\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 
18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.802198 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcpcd\" (UniqueName: \"kubernetes.io/projected/ec7672b9-b70c-4a7e-bda0-e880b0b57555-kube-api-access-fcpcd\") pod \"dnsmasq-dns-7bf5f74b79-dx87h\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.839508 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bf5f74b79-dx87h"] Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.909913 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.918835 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.942945 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-pvpzs" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.943507 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.953818 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d874caba-3ebf-4604-a459-efb84bcf3f69-logs\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.953865 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-config-data\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.953918 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55nnt\" (UniqueName: \"kubernetes.io/projected/d874caba-3ebf-4604-a459-efb84bcf3f69-kube-api-access-55nnt\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.953944 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.953977 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.969319 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:48:20 crc kubenswrapper[4782]: I0130 18:48:20.988651 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.056781 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.056848 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.056934 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d874caba-3ebf-4604-a459-efb84bcf3f69-logs\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.056954 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-config-data\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.057015 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55nnt\" (UniqueName: \"kubernetes.io/projected/d874caba-3ebf-4604-a459-efb84bcf3f69-kube-api-access-55nnt\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.058253 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d874caba-3ebf-4604-a459-efb84bcf3f69-logs\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.062483 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.064337 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.067930 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.068435 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.068481 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn69f\" (UniqueName: \"kubernetes.io/projected/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-kube-api-access-bn69f\") pod \"keystone-bootstrap-nfrgc\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.093554 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.095610 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-config-data\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.107539 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bbf49fbcf-9tf5s"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.109208 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.129980 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-5nqzr" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.130090 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.130199 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.130372 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.144947 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55nnt\" (UniqueName: \"kubernetes.io/projected/d874caba-3ebf-4604-a459-efb84bcf3f69-kube-api-access-55nnt\") pod \"watcher-api-0\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.150458 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.193420 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bbf49fbcf-9tf5s"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.243468 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mpvf8"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.249411 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.255730 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.262740 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmkk\" (UniqueName: \"kubernetes.io/projected/2db1b143-b7ce-4cc4-8412-6e3402508e98-kube-api-access-fvmkk\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.262875 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db1b143-b7ce-4cc4-8412-6e3402508e98-logs\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.262945 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-scripts\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.263051 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxqc\" (UniqueName: \"kubernetes.io/projected/70bb6236-54ab-4d4b-8219-44b7fa48e716-kube-api-access-xtxqc\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.263193 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70bb6236-54ab-4d4b-8219-44b7fa48e716-horizon-secret-key\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.263307 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-config-data\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.264723 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zpd22" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.264907 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-q8pml"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.264944 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.265278 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.265383 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70bb6236-54ab-4d4b-8219-44b7fa48e716-logs\") pod 
\"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.265448 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-config-data\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.265701 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.266129 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.270949 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.271102 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nggnw" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.271205 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.291648 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.331668 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.351051 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383059 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70bb6236-54ab-4d4b-8219-44b7fa48e716-logs\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383107 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-config-data\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383175 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzzts\" (UniqueName: \"kubernetes.io/projected/63ffaa09-371e-4549-a56f-11d6734ff40e-kube-api-access-mzzts\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383207 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-scripts\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383234 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-combined-ca-bundle\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383271 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-db-sync-config-data\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383301 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmkk\" (UniqueName: \"kubernetes.io/projected/2db1b143-b7ce-4cc4-8412-6e3402508e98-kube-api-access-fvmkk\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.383685 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.385974 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.394183 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-etc-machine-id\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 
18:48:21.395070 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70bb6236-54ab-4d4b-8219-44b7fa48e716-logs\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.397393 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-config-data\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.397574 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-scripts\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.398073 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db1b143-b7ce-4cc4-8412-6e3402508e98-logs\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.398139 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxqc\" (UniqueName: \"kubernetes.io/projected/70bb6236-54ab-4d4b-8219-44b7fa48e716-kube-api-access-xtxqc\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.398173 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-combined-ca-bundle\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.398274 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-config\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.398589 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70bb6236-54ab-4d4b-8219-44b7fa48e716-horizon-secret-key\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.398664 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-config-data\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.398726 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vxdc\" (UniqueName: 
\"kubernetes.io/projected/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-kube-api-access-9vxdc\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.399088 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-config-data\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.399389 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.454361 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-scripts\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.458862 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q8pml"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.467360 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70bb6236-54ab-4d4b-8219-44b7fa48e716-horizon-secret-key\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.467760 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db1b143-b7ce-4cc4-8412-6e3402508e98-logs\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.478211 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.489127 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmkk\" (UniqueName: \"kubernetes.io/projected/2db1b143-b7ce-4cc4-8412-6e3402508e98-kube-api-access-fvmkk\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.494892 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-config-data\") pod \"watcher-applier-0\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.516325 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mpvf8"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.534137 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xtxqc\" (UniqueName: \"kubernetes.io/projected/70bb6236-54ab-4d4b-8219-44b7fa48e716-kube-api-access-xtxqc\") pod \"horizon-bbf49fbcf-9tf5s\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549197 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-combined-ca-bundle\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549260 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-config\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549304 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzpx5\" (UniqueName: \"kubernetes.io/projected/627f3bc8-cc79-4500-81d4-4dc041b88394-kube-api-access-nzpx5\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549325 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-config-data\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549348 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vxdc\" (UniqueName: \"kubernetes.io/projected/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-kube-api-access-9vxdc\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549373 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627f3bc8-cc79-4500-81d4-4dc041b88394-logs\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549392 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549408 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549455 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-config-data\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549475 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzzts\" (UniqueName: \"kubernetes.io/projected/63ffaa09-371e-4549-a56f-11d6734ff40e-kube-api-access-mzzts\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549498 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-scripts\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549517 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-combined-ca-bundle\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549544 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-db-sync-config-data\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549580 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-etc-machine-id\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.549674 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-etc-machine-id\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.564411 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-combined-ca-bundle\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.569246 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-combined-ca-bundle\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.570990 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-config\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc 
kubenswrapper[4782]: I0130 18:48:21.571332 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-scripts\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.571376 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-db-sync-config-data\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.577402 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzzts\" (UniqueName: \"kubernetes.io/projected/63ffaa09-371e-4549-a56f-11d6734ff40e-kube-api-access-mzzts\") pod \"neutron-db-sync-q8pml\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.582055 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-config-data\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.582405 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vxdc\" (UniqueName: \"kubernetes.io/projected/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-kube-api-access-9vxdc\") pod \"cinder-db-sync-mpvf8\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.609308 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-p9dwp"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.610724 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.614389 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.616453 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.619130 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zdhpz" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.643156 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-p9dwp"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.651140 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-config-data\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.651409 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzpx5\" (UniqueName: \"kubernetes.io/projected/627f3bc8-cc79-4500-81d4-4dc041b88394-kube-api-access-nzpx5\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.651476 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627f3bc8-cc79-4500-81d4-4dc041b88394-logs\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.651506 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.651554 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.656296 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627f3bc8-cc79-4500-81d4-4dc041b88394-logs\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.658260 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bf5f74b79-dx87h"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.670754 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 
18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.672795 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-config-data\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.682291 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-f7fq6"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.683309 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.683451 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.686628 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.686826 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t8tvj" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.695539 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzpx5\" (UniqueName: \"kubernetes.io/projected/627f3bc8-cc79-4500-81d4-4dc041b88394-kube-api-access-nzpx5\") pod \"watcher-decision-engine-0\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.696591 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.702494 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.710471 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.716085 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.716302 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.719945 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f7fq6"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.736088 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc4f6997f-vmkbp"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.739009 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.749863 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc4f6997f-vmkbp"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.754216 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-config-data\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.754297 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl77v\" (UniqueName: \"kubernetes.io/projected/c0087b5d-ef94-4433-9e0e-23b509dd3003-kube-api-access-pl77v\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.754754 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-combined-ca-bundle\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.754858 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0087b5d-ef94-4433-9e0e-23b509dd3003-logs\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.754978 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-scripts\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.760297 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d88448fc-75xln"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.763863 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.777478 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d88448fc-75xln"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.779457 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.793375 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q8pml" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.808421 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.808818 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858826 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858875 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858893 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-combined-ca-bundle\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858917 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-log-httpd\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858947 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858965 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-svc\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858981 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-config-data\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.858998 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-scripts\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859027 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-horizon-secret-key\") pod \"horizon-5d88448fc-75xln\" (UID: 
\"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859043 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-logs\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859064 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-config\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859087 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-combined-ca-bundle\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859112 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqb9b\" (UniqueName: \"kubernetes.io/projected/47aa0756-718b-4d1c-bef5-318895ee6c90-kube-api-access-zqb9b\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859132 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnd6t\" (UniqueName: \"kubernetes.io/projected/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-kube-api-access-cnd6t\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859146 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shsqj\" (UniqueName: \"kubernetes.io/projected/60cc626f-3c26-4417-87f0-5000cdbaadda-kube-api-access-shsqj\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859170 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-db-sync-config-data\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859190 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0087b5d-ef94-4433-9e0e-23b509dd3003-logs\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859205 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-scripts\") pod \"ceilometer-0\" (UID: 
\"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859222 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-scripts\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859250 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859281 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-config-data\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859299 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-run-httpd\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859316 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859340 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zqs\" (UniqueName: \"kubernetes.io/projected/a13b1ca2-1722-425b-ae48-d34c99f746f6-kube-api-access-84zqs\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859372 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-config-data\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.859400 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl77v\" (UniqueName: \"kubernetes.io/projected/c0087b5d-ef94-4433-9e0e-23b509dd3003-kube-api-access-pl77v\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.866308 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0087b5d-ef94-4433-9e0e-23b509dd3003-logs\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.867373 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-config-data\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.867641 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-combined-ca-bundle\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.875248 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-scripts\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.882916 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.907010 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl77v\" (UniqueName: \"kubernetes.io/projected/c0087b5d-ef94-4433-9e0e-23b509dd3003-kube-api-access-pl77v\") pod \"placement-db-sync-p9dwp\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965104 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965387 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965408 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-combined-ca-bundle\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965429 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-log-httpd\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965465 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965500 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-svc\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965518 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-config-data\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965534 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-scripts\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965559 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-horizon-secret-key\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965575 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-logs\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965595 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-config\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965624 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqb9b\" (UniqueName: \"kubernetes.io/projected/47aa0756-718b-4d1c-bef5-318895ee6c90-kube-api-access-zqb9b\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965645 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnd6t\" (UniqueName: \"kubernetes.io/projected/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-kube-api-access-cnd6t\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965660 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shsqj\" (UniqueName: \"kubernetes.io/projected/60cc626f-3c26-4417-87f0-5000cdbaadda-kube-api-access-shsqj\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965682 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-db-sync-config-data\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965705 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-scripts\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965721 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965749 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-config-data\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965766 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-run-httpd\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965786 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.965803 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zqs\" (UniqueName: \"kubernetes.io/projected/a13b1ca2-1722-425b-ae48-d34c99f746f6-kube-api-access-84zqs\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.966754 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.967151 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-config-data\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.967700 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-scripts\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.967728 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-svc\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.967988 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.968163 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-log-httpd\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.968580 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-logs\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.969752 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-config\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.969985 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-run-httpd\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.970382 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.987082 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-config-data\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.987519 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.987780 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.990787 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnd6t\" (UniqueName: \"kubernetes.io/projected/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-kube-api-access-cnd6t\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.994141 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bf5f74b79-dx87h"] Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.997006 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-db-sync-config-data\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.997033 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-horizon-secret-key\") pod \"horizon-5d88448fc-75xln\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:21 crc kubenswrapper[4782]: I0130 18:48:21.998831 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-scripts\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.006051 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shsqj\" (UniqueName: \"kubernetes.io/projected/60cc626f-3c26-4417-87f0-5000cdbaadda-kube-api-access-shsqj\") pod \"dnsmasq-dns-5cc4f6997f-vmkbp\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.006063 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zqs\" (UniqueName: \"kubernetes.io/projected/a13b1ca2-1722-425b-ae48-d34c99f746f6-kube-api-access-84zqs\") pod \"ceilometer-0\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " pod="openstack/ceilometer-0" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.006554 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqb9b\" (UniqueName: \"kubernetes.io/projected/47aa0756-718b-4d1c-bef5-318895ee6c90-kube-api-access-zqb9b\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.009672 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-combined-ca-bundle\") pod \"barbican-db-sync-f7fq6\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:22 crc kubenswrapper[4782]: W0130 18:48:22.035469 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec7672b9_b70c_4a7e_bda0_e880b0b57555.slice/crio-55f393ff1120f7b79ac4fdeb64fdcbbaca1dd2fbc8d8340fe10ebbf07d205692 WatchSource:0}: Error finding container 55f393ff1120f7b79ac4fdeb64fdcbbaca1dd2fbc8d8340fe10ebbf07d205692: Status 404 returned error can't find the container with 
id 55f393ff1120f7b79ac4fdeb64fdcbbaca1dd2fbc8d8340fe10ebbf07d205692 Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.365521 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-p9dwp" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.367406 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.398674 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.400157 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.401320 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.452410 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nfrgc"] Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.452442 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.502353 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfrgc" event={"ID":"8d9deb04-4e5b-4f44-990d-9c2c0ce06443","Type":"ContainerStarted","Data":"009edf0f86365ed08c912458866317b8f78a00655ca9bc877defe39a8922a130"} Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.505177 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" event={"ID":"ec7672b9-b70c-4a7e-bda0-e880b0b57555","Type":"ContainerStarted","Data":"55f393ff1120f7b79ac4fdeb64fdcbbaca1dd2fbc8d8340fe10ebbf07d205692"} Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.507529 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d874caba-3ebf-4604-a459-efb84bcf3f69","Type":"ContainerStarted","Data":"d505a37877bdf0de7e5894e3860a68a3756d552a9b6403605566a883203a9ac8"} Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.507722 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" podUID="ac812687-eb03-4784-ad8f-ff91ebd39d28" containerName="dnsmasq-dns" containerID="cri-o://97bbefef290e953f0a9f2ceb37abae11f699d1bfdf4438a148cb27510245ffca" gracePeriod=10 Jan 30 18:48:22 crc kubenswrapper[4782]: I0130 18:48:22.569929 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:48:22 crc kubenswrapper[4782]: W0130 18:48:22.714205 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db1b143_b7ce_4cc4_8412_6e3402508e98.slice/crio-4ec310566051baf8561483fe2226455733ca5f420213de022c0f54091e06ff46 WatchSource:0}: Error finding container 4ec310566051baf8561483fe2226455733ca5f420213de022c0f54091e06ff46: Status 404 returned error can't find the container with id 4ec310566051baf8561483fe2226455733ca5f420213de022c0f54091e06ff46 Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.077106 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bbf49fbcf-9tf5s"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.144849 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/watcher-decision-engine-0"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.153732 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q8pml"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.202056 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mpvf8"] Jan 30 18:48:23 crc kubenswrapper[4782]: W0130 18:48:23.208709 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63ffaa09_371e_4549_a56f_11d6734ff40e.slice/crio-37b85200834c2d2a44022730c59b5114c124ba6fdcd976c7c8eb11758be77057 WatchSource:0}: Error finding container 37b85200834c2d2a44022730c59b5114c124ba6fdcd976c7c8eb11758be77057: Status 404 returned error can't find the container with id 37b85200834c2d2a44022730c59b5114c124ba6fdcd976c7c8eb11758be77057 Jan 30 18:48:23 crc kubenswrapper[4782]: W0130 18:48:23.214492 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9ddf9ab_a21e_4d41_b795_c0c926e38a1e.slice/crio-633e28b09f8def8c1d3f193b0fe796f80c1a4117196811f3c8fa6c1980637517 WatchSource:0}: Error finding container 633e28b09f8def8c1d3f193b0fe796f80c1a4117196811f3c8fa6c1980637517: Status 404 returned error can't find the container with id 633e28b09f8def8c1d3f193b0fe796f80c1a4117196811f3c8fa6c1980637517 Jan 30 18:48:23 crc kubenswrapper[4782]: W0130 18:48:23.499451 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda13b1ca2_1722_425b_ae48_d34c99f746f6.slice/crio-11aea3b826fe2c4213c49072055b289505f153e9bc3daf420a1d7c622ec9058b WatchSource:0}: Error finding container 11aea3b826fe2c4213c49072055b289505f153e9bc3daf420a1d7c622ec9058b: Status 404 returned error can't find the container with id 11aea3b826fe2c4213c49072055b289505f153e9bc3daf420a1d7c622ec9058b Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.500087 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.575487 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d88448fc-75xln"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.592484 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"627f3bc8-cc79-4500-81d4-4dc041b88394","Type":"ContainerStarted","Data":"0376846e9068aaeba1b63273b3d13d482a8620b17252f42a5236b30d6b0e0bd3"} Jan 30 18:48:23 crc kubenswrapper[4782]: W0130 18:48:23.620326 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bf0d39e_8ee2_4500_b349_ea63b4df15d0.slice/crio-058f3ddba75bddd599412c7beb31e93d6e4d81acea96e4eb1179a8d319f11a87 WatchSource:0}: Error finding container 058f3ddba75bddd599412c7beb31e93d6e4d81acea96e4eb1179a8d319f11a87: Status 404 returned error can't find the container with id 058f3ddba75bddd599412c7beb31e93d6e4d81acea96e4eb1179a8d319f11a87 Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.646633 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfrgc" event={"ID":"8d9deb04-4e5b-4f44-990d-9c2c0ce06443","Type":"ContainerStarted","Data":"07829a177d4c5bb5f2e54de53ef57ae9464ada31d0843e33381de37958a880f6"} Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.662171 4782 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-5cc4f6997f-vmkbp"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.678413 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.699067 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q8pml" event={"ID":"63ffaa09-371e-4549-a56f-11d6734ff40e","Type":"ContainerStarted","Data":"ac7b4bd8674c634ebaaebc3c3cacd267e39e6541b520fc04a9b6032720f3ce1b"} Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.699107 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q8pml" event={"ID":"63ffaa09-371e-4549-a56f-11d6734ff40e","Type":"ContainerStarted","Data":"37b85200834c2d2a44022730c59b5114c124ba6fdcd976c7c8eb11758be77057"} Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.742874 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-f7fq6"] Jan 30 18:48:23 crc kubenswrapper[4782]: W0130 18:48:23.753939 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60cc626f_3c26_4417_87f0_5000cdbaadda.slice/crio-4ee324cff937ca7e6b51908207a474bd16466e617a15c1cd7df120a4654a88b0 WatchSource:0}: Error finding container 4ee324cff937ca7e6b51908207a474bd16466e617a15c1cd7df120a4654a88b0: Status 404 returned error can't find the container with id 4ee324cff937ca7e6b51908207a474bd16466e617a15c1cd7df120a4654a88b0 Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.791600 4782 generic.go:334] "Generic (PLEG): container finished" podID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerID="b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555" exitCode=0 Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.792418 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-p9dwp"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.792467 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerDied","Data":"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555"} Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.825015 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nfrgc" podStartSLOduration=3.824992431 podStartE2EDuration="3.824992431s" podCreationTimestamp="2026-01-30 18:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:48:23.689526425 +0000 UTC m=+1079.957904450" watchObservedRunningTime="2026-01-30 18:48:23.824992431 +0000 UTC m=+1080.093370456" Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.845598 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-q8pml" podStartSLOduration=2.845580861 podStartE2EDuration="2.845580861s" podCreationTimestamp="2026-01-30 18:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:48:23.743641476 +0000 UTC m=+1080.012019501" watchObservedRunningTime="2026-01-30 18:48:23.845580861 +0000 UTC m=+1080.113958876" Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.857525 4782 generic.go:334] "Generic (PLEG): container finished" podID="ac812687-eb03-4784-ad8f-ff91ebd39d28" 
containerID="97bbefef290e953f0a9f2ceb37abae11f699d1bfdf4438a148cb27510245ffca" exitCode=0 Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.857615 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" event={"ID":"ac812687-eb03-4784-ad8f-ff91ebd39d28","Type":"ContainerDied","Data":"97bbefef290e953f0a9f2ceb37abae11f699d1bfdf4438a148cb27510245ffca"} Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.898429 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bbf49fbcf-9tf5s"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.912342 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.941292 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64c9484657-zqf7j"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.942762 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.952723 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" event={"ID":"ec7672b9-b70c-4a7e-bda0-e880b0b57555","Type":"ContainerStarted","Data":"cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe"} Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.952923 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" podUID="ec7672b9-b70c-4a7e-bda0-e880b0b57555" containerName="init" containerID="cri-o://cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe" gracePeriod=10 Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.958594 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64c9484657-zqf7j"] Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.970089 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mpvf8" event={"ID":"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e","Type":"ContainerStarted","Data":"633e28b09f8def8c1d3f193b0fe796f80c1a4117196811f3c8fa6c1980637517"} Jan 30 18:48:23 crc kubenswrapper[4782]: I0130 18:48:23.987588 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2db1b143-b7ce-4cc4-8412-6e3402508e98","Type":"ContainerStarted","Data":"4ec310566051baf8561483fe2226455733ca5f420213de022c0f54091e06ff46"} Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.002475 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerStarted","Data":"11aea3b826fe2c4213c49072055b289505f153e9bc3daf420a1d7c622ec9058b"} Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.023245 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bbf49fbcf-9tf5s" event={"ID":"70bb6236-54ab-4d4b-8219-44b7fa48e716","Type":"ContainerStarted","Data":"f4dfebfac2b07084b6425c4fb7faf9bbc37197ef888f915a271ef080a1038db9"} Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.054687 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799a6410-8ee6-42c6-9336-c2f84d59b724-horizon-secret-key\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 
18:48:24.054748 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-scripts\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.054773 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-config-data\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.054827 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvnqv\" (UniqueName: \"kubernetes.io/projected/799a6410-8ee6-42c6-9336-c2f84d59b724-kube-api-access-gvnqv\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.054869 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799a6410-8ee6-42c6-9336-c2f84d59b724-logs\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.085973 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d874caba-3ebf-4604-a459-efb84bcf3f69","Type":"ContainerStarted","Data":"5c3e8dbfd9edf846d7085f465be4ab99ea5db8a559a96dd0e769f75d444cf786"} Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.177332 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799a6410-8ee6-42c6-9336-c2f84d59b724-horizon-secret-key\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.177391 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-scripts\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.177417 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-config-data\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.177491 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvnqv\" (UniqueName: \"kubernetes.io/projected/799a6410-8ee6-42c6-9336-c2f84d59b724-kube-api-access-gvnqv\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.177549 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/799a6410-8ee6-42c6-9336-c2f84d59b724-logs\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.178301 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799a6410-8ee6-42c6-9336-c2f84d59b724-logs\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.178549 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-scripts\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.179377 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-config-data\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.205773 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799a6410-8ee6-42c6-9336-c2f84d59b724-horizon-secret-key\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.215691 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.216154 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvnqv\" (UniqueName: \"kubernetes.io/projected/799a6410-8ee6-42c6-9336-c2f84d59b724-kube-api-access-gvnqv\") pod \"horizon-64c9484657-zqf7j\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.283981 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-swift-storage-0\") pod \"ac812687-eb03-4784-ad8f-ff91ebd39d28\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.284078 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg7js\" (UniqueName: \"kubernetes.io/projected/ac812687-eb03-4784-ad8f-ff91ebd39d28-kube-api-access-jg7js\") pod \"ac812687-eb03-4784-ad8f-ff91ebd39d28\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.284132 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-sb\") pod \"ac812687-eb03-4784-ad8f-ff91ebd39d28\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.284179 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-config\") pod \"ac812687-eb03-4784-ad8f-ff91ebd39d28\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.284212 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-nb\") pod \"ac812687-eb03-4784-ad8f-ff91ebd39d28\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.284295 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-svc\") pod \"ac812687-eb03-4784-ad8f-ff91ebd39d28\" (UID: \"ac812687-eb03-4784-ad8f-ff91ebd39d28\") " Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.292670 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac812687-eb03-4784-ad8f-ff91ebd39d28-kube-api-access-jg7js" (OuterVolumeSpecName: "kube-api-access-jg7js") pod "ac812687-eb03-4784-ad8f-ff91ebd39d28" (UID: "ac812687-eb03-4784-ad8f-ff91ebd39d28"). InnerVolumeSpecName "kube-api-access-jg7js". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.307860 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.387818 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg7js\" (UniqueName: \"kubernetes.io/projected/ac812687-eb03-4784-ad8f-ff91ebd39d28-kube-api-access-jg7js\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.707858 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac812687-eb03-4784-ad8f-ff91ebd39d28" (UID: "ac812687-eb03-4784-ad8f-ff91ebd39d28"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.710987 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.774594 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac812687-eb03-4784-ad8f-ff91ebd39d28" (UID: "ac812687-eb03-4784-ad8f-ff91ebd39d28"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.774986 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-config" (OuterVolumeSpecName: "config") pod "ac812687-eb03-4784-ad8f-ff91ebd39d28" (UID: "ac812687-eb03-4784-ad8f-ff91ebd39d28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.793943 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac812687-eb03-4784-ad8f-ff91ebd39d28" (UID: "ac812687-eb03-4784-ad8f-ff91ebd39d28"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.812544 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.812578 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.812587 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.852730 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac812687-eb03-4784-ad8f-ff91ebd39d28" (UID: "ac812687-eb03-4784-ad8f-ff91ebd39d28"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.907792 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64c9484657-zqf7j"] Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.914058 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac812687-eb03-4784-ad8f-ff91ebd39d28-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:24 crc kubenswrapper[4782]: I0130 18:48:24.975444 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.115972 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-config\") pod \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.116043 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-swift-storage-0\") pod \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.116131 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-nb\") pod \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.116198 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-sb\") pod \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.116250 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcpcd\" (UniqueName: \"kubernetes.io/projected/ec7672b9-b70c-4a7e-bda0-e880b0b57555-kube-api-access-fcpcd\") pod \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.116361 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-svc\") pod \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\" (UID: \"ec7672b9-b70c-4a7e-bda0-e880b0b57555\") " Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.123207 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" event={"ID":"60cc626f-3c26-4417-87f0-5000cdbaadda","Type":"ContainerStarted","Data":"409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.123279 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" event={"ID":"60cc626f-3c26-4417-87f0-5000cdbaadda","Type":"ContainerStarted","Data":"4ee324cff937ca7e6b51908207a474bd16466e617a15c1cd7df120a4654a88b0"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.132679 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p9dwp" event={"ID":"c0087b5d-ef94-4433-9e0e-23b509dd3003","Type":"ContainerStarted","Data":"90c7ce74ee5af845fea5c9b9b2ea323fef7eef6e7c0b39cb26ec3e381c7a71f5"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.138916 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64c9484657-zqf7j" event={"ID":"799a6410-8ee6-42c6-9336-c2f84d59b724","Type":"ContainerStarted","Data":"c3593caf472d8a22294c62766e1f564ee175ed6ca88e17ef7f7d9353c2a071b7"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.149391 4782 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-db-sync-f7fq6" event={"ID":"47aa0756-718b-4d1c-bef5-318895ee6c90","Type":"ContainerStarted","Data":"5ce6a51b97e0f301f087c16079800082fc48fc8ecc3725fe5dc7655905990dd0"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.177465 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec7672b9-b70c-4a7e-bda0-e880b0b57555-kube-api-access-fcpcd" (OuterVolumeSpecName: "kube-api-access-fcpcd") pod "ec7672b9-b70c-4a7e-bda0-e880b0b57555" (UID: "ec7672b9-b70c-4a7e-bda0-e880b0b57555"). InnerVolumeSpecName "kube-api-access-fcpcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.181243 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec7672b9-b70c-4a7e-bda0-e880b0b57555" (UID: "ec7672b9-b70c-4a7e-bda0-e880b0b57555"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.188162 4782 generic.go:334] "Generic (PLEG): container finished" podID="ec7672b9-b70c-4a7e-bda0-e880b0b57555" containerID="cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe" exitCode=0 Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.188465 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" event={"ID":"ec7672b9-b70c-4a7e-bda0-e880b0b57555","Type":"ContainerDied","Data":"cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.189458 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.191280 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bf5f74b79-dx87h" event={"ID":"ec7672b9-b70c-4a7e-bda0-e880b0b57555","Type":"ContainerDied","Data":"55f393ff1120f7b79ac4fdeb64fdcbbaca1dd2fbc8d8340fe10ebbf07d205692"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.191318 4782 scope.go:117] "RemoveContainer" containerID="cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.194153 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d874caba-3ebf-4604-a459-efb84bcf3f69","Type":"ContainerStarted","Data":"62130a5829f203b8ec1cd565b6fea1c87394f2712df2e49778f96a1795ddcfd4"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.194895 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api-log" containerID="cri-o://5c3e8dbfd9edf846d7085f465be4ab99ea5db8a559a96dd0e769f75d444cf786" gracePeriod=30 Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.194971 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" containerID="cri-o://62130a5829f203b8ec1cd565b6fea1c87394f2712df2e49778f96a1795ddcfd4" gracePeriod=30 Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.195154 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.200802 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ec7672b9-b70c-4a7e-bda0-e880b0b57555" (UID: "ec7672b9-b70c-4a7e-bda0-e880b0b57555"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.204823 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d88448fc-75xln" event={"ID":"3bf0d39e-8ee2-4500-b349-ea63b4df15d0","Type":"ContainerStarted","Data":"058f3ddba75bddd599412c7beb31e93d6e4d81acea96e4eb1179a8d319f11a87"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.207867 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-config" (OuterVolumeSpecName: "config") pod "ec7672b9-b70c-4a7e-bda0-e880b0b57555" (UID: "ec7672b9-b70c-4a7e-bda0-e880b0b57555"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.214690 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec7672b9-b70c-4a7e-bda0-e880b0b57555" (UID: "ec7672b9-b70c-4a7e-bda0-e880b0b57555"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.222827 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerStarted","Data":"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.223199 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": EOF" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.225571 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.225597 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcpcd\" (UniqueName: \"kubernetes.io/projected/ec7672b9-b70c-4a7e-bda0-e880b0b57555-kube-api-access-fcpcd\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.225609 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.225618 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.225632 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.235962 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=5.235937156 podStartE2EDuration="5.235937156s" podCreationTimestamp="2026-01-30 18:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:48:25.219910619 +0000 UTC m=+1081.488288654" watchObservedRunningTime="2026-01-30 18:48:25.235937156 +0000 UTC m=+1081.504315181" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.242861 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" event={"ID":"ac812687-eb03-4784-ad8f-ff91ebd39d28","Type":"ContainerDied","Data":"a33d4478d98375bfc760061cc0d9d00f80bf717f04af79808219696e7ae6cfce"} Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.246073 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658798c9cc-hdk8g" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.281301 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ec7672b9-b70c-4a7e-bda0-e880b0b57555" (UID: "ec7672b9-b70c-4a7e-bda0-e880b0b57555"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.332368 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ec7672b9-b70c-4a7e-bda0-e880b0b57555-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.333371 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658798c9cc-hdk8g"] Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.341788 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658798c9cc-hdk8g"] Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.551901 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bf5f74b79-dx87h"] Jan 30 18:48:25 crc kubenswrapper[4782]: I0130 18:48:25.556883 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bf5f74b79-dx87h"] Jan 30 18:48:26 crc kubenswrapper[4782]: I0130 18:48:26.257391 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 18:48:26 crc kubenswrapper[4782]: I0130 18:48:26.258314 4782 generic.go:334] "Generic (PLEG): container finished" podID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerID="409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe" exitCode=0 Jan 30 18:48:26 crc kubenswrapper[4782]: I0130 18:48:26.258375 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" event={"ID":"60cc626f-3c26-4417-87f0-5000cdbaadda","Type":"ContainerDied","Data":"409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe"} Jan 30 18:48:26 crc kubenswrapper[4782]: I0130 18:48:26.263784 4782 generic.go:334] "Generic (PLEG): container finished" podID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerID="5c3e8dbfd9edf846d7085f465be4ab99ea5db8a559a96dd0e769f75d444cf786" exitCode=143 Jan 30 18:48:26 crc kubenswrapper[4782]: I0130 18:48:26.263819 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d874caba-3ebf-4604-a459-efb84bcf3f69","Type":"ContainerDied","Data":"5c3e8dbfd9edf846d7085f465be4ab99ea5db8a559a96dd0e769f75d444cf786"} Jan 30 18:48:26 crc kubenswrapper[4782]: I0130 18:48:26.422743 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac812687-eb03-4784-ad8f-ff91ebd39d28" path="/var/lib/kubelet/pods/ac812687-eb03-4784-ad8f-ff91ebd39d28/volumes" Jan 30 18:48:26 crc kubenswrapper[4782]: I0130 18:48:26.423504 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec7672b9-b70c-4a7e-bda0-e880b0b57555" path="/var/lib/kubelet/pods/ec7672b9-b70c-4a7e-bda0-e880b0b57555/volumes" Jan 30 18:48:28 crc kubenswrapper[4782]: I0130 18:48:28.982328 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": read tcp 10.217.0.2:57692->10.217.0.150:9322: read: connection reset by peer" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.293880 4782 generic.go:334] "Generic (PLEG): container finished" podID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerID="62130a5829f203b8ec1cd565b6fea1c87394f2712df2e49778f96a1795ddcfd4" exitCode=0 Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.293966 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"d874caba-3ebf-4604-a459-efb84bcf3f69","Type":"ContainerDied","Data":"62130a5829f203b8ec1cd565b6fea1c87394f2712df2e49778f96a1795ddcfd4"} Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.299262 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerStarted","Data":"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f"} Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.721063 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d88448fc-75xln"] Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.757854 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b5464cf9b-2tbsc"] Jan 30 18:48:29 crc kubenswrapper[4782]: E0130 18:48:29.759471 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac812687-eb03-4784-ad8f-ff91ebd39d28" containerName="init" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.759626 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac812687-eb03-4784-ad8f-ff91ebd39d28" containerName="init" Jan 30 18:48:29 crc kubenswrapper[4782]: E0130 18:48:29.759779 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac812687-eb03-4784-ad8f-ff91ebd39d28" containerName="dnsmasq-dns" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.763499 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac812687-eb03-4784-ad8f-ff91ebd39d28" containerName="dnsmasq-dns" Jan 30 18:48:29 crc kubenswrapper[4782]: E0130 18:48:29.763677 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec7672b9-b70c-4a7e-bda0-e880b0b57555" containerName="init" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.763779 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec7672b9-b70c-4a7e-bda0-e880b0b57555" containerName="init" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.764175 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec7672b9-b70c-4a7e-bda0-e880b0b57555" containerName="init" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.765519 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac812687-eb03-4784-ad8f-ff91ebd39d28" containerName="dnsmasq-dns" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.767304 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.772010 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.788942 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b5464cf9b-2tbsc"] Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.839320 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64c9484657-zqf7j"] Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.849525 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d67b5c94d-pwj69"] Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.851295 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.874728 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d67b5c94d-pwj69"] Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.885960 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-config-data\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.886069 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-scripts\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.886103 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-combined-ca-bundle\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.886286 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-secret-key\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.886354 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-tls-certs\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.886390 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp5sf\" (UniqueName: \"kubernetes.io/projected/b2e303df-cb69-4f2f-909f-a1651d376adc-kube-api-access-fp5sf\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.886423 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e303df-cb69-4f2f-909f-a1651d376adc-logs\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.994507 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-tls-certs\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995096 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp5sf\" 
(UniqueName: \"kubernetes.io/projected/b2e303df-cb69-4f2f-909f-a1651d376adc-kube-api-access-fp5sf\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995143 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e303df-cb69-4f2f-909f-a1651d376adc-logs\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995213 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-combined-ca-bundle\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995313 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-horizon-tls-certs\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995335 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e414f7-9297-46fb-87b6-19ce7ee55758-config-data\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995402 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-horizon-secret-key\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995837 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-config-data\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995871 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-scripts\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.995936 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-combined-ca-bundle\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.996011 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cblf9\" (UniqueName: 
\"kubernetes.io/projected/53e414f7-9297-46fb-87b6-19ce7ee55758-kube-api-access-cblf9\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.997308 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53e414f7-9297-46fb-87b6-19ce7ee55758-scripts\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.997461 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-secret-key\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.997498 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e414f7-9297-46fb-87b6-19ce7ee55758-logs\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.998428 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-scripts\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.998746 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-config-data\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.999296 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e303df-cb69-4f2f-909f-a1651d376adc-logs\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:29 crc kubenswrapper[4782]: I0130 18:48:29.999515 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-tls-certs\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.009911 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-secret-key\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.010262 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-combined-ca-bundle\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " 
pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.019307 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp5sf\" (UniqueName: \"kubernetes.io/projected/b2e303df-cb69-4f2f-909f-a1651d376adc-kube-api-access-fp5sf\") pod \"horizon-b5464cf9b-2tbsc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.099138 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cblf9\" (UniqueName: \"kubernetes.io/projected/53e414f7-9297-46fb-87b6-19ce7ee55758-kube-api-access-cblf9\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.099225 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53e414f7-9297-46fb-87b6-19ce7ee55758-scripts\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.099361 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e414f7-9297-46fb-87b6-19ce7ee55758-logs\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.099426 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-combined-ca-bundle\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.099455 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-horizon-tls-certs\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.099484 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e414f7-9297-46fb-87b6-19ce7ee55758-config-data\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.099517 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-horizon-secret-key\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.100058 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53e414f7-9297-46fb-87b6-19ce7ee55758-scripts\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.101424 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53e414f7-9297-46fb-87b6-19ce7ee55758-config-data\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.102250 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53e414f7-9297-46fb-87b6-19ce7ee55758-logs\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.105362 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-combined-ca-bundle\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.105633 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-horizon-tls-certs\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.108820 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/53e414f7-9297-46fb-87b6-19ce7ee55758-horizon-secret-key\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.108987 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.115644 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cblf9\" (UniqueName: \"kubernetes.io/projected/53e414f7-9297-46fb-87b6-19ce7ee55758-kube-api-access-cblf9\") pod \"horizon-7d67b5c94d-pwj69\" (UID: \"53e414f7-9297-46fb-87b6-19ce7ee55758\") " pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.195057 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.310389 4782 generic.go:334] "Generic (PLEG): container finished" podID="8d9deb04-4e5b-4f44-990d-9c2c0ce06443" containerID="07829a177d4c5bb5f2e54de53ef57ae9464ada31d0843e33381de37958a880f6" exitCode=0 Jan 30 18:48:30 crc kubenswrapper[4782]: I0130 18:48:30.310439 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfrgc" event={"ID":"8d9deb04-4e5b-4f44-990d-9c2c0ce06443","Type":"ContainerDied","Data":"07829a177d4c5bb5f2e54de53ef57ae9464ada31d0843e33381de37958a880f6"} Jan 30 18:48:31 crc kubenswrapper[4782]: I0130 18:48:31.256155 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": dial tcp 10.217.0.150:9322: connect: connection refused" Jan 30 18:48:41 crc kubenswrapper[4782]: I0130 18:48:41.256884 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:48:46 crc kubenswrapper[4782]: I0130 18:48:46.257887 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.845628 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.845700 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.845903 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57fh594h5d7h56h64h595h68ch595h57h658h5c9h5c7h696h68bh58ch54ch56dh689h695h558h59bhbch667h66h698h5b8h565h65dh6h94h686hb8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnd6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5d88448fc-75xln_openstack(3bf0d39e-8ee2-4500-b349-ea63b4df15d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.848276 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-5d88448fc-75xln" podUID="3bf0d39e-8ee2-4500-b349-ea63b4df15d0" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.869956 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.870010 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.870149 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n557h6hb5hddhf6h87h8ch58dh644h648h68h68h55hd5hfdhch584h9h588h5c4hfch544h7ch5d4h685h9h558h675h6dh667hc6h5b6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtxqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-bbf49fbcf-9tf5s_openstack(70bb6236-54ab-4d4b-8219-44b7fa48e716): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:48:46 crc kubenswrapper[4782]: E0130 18:48:46.873020 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-bbf49fbcf-9tf5s" podUID="70bb6236-54ab-4d4b-8219-44b7fa48e716" Jan 30 18:48:51 crc kubenswrapper[4782]: I0130 18:48:51.259317 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.342524 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.342919 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.343250 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bbh55bh86h566h569h5dh594hc8h5c4h66bhc7h575h76h88h68dh648h54fh648h5d7h545hbbh579h657h56ch64dhfbh668h5b4h685h594h6ch66fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gvnqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-64c9484657-zqf7j_openstack(799a6410-8ee6-42c6-9336-c2f84d59b724): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.346738 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-64c9484657-zqf7j" podUID="799a6410-8ee6-42c6-9336-c2f84d59b724" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.396212 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.396322 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.396611 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.5:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b7f88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-bpwp5_openstack(07adaf47-0b0c-46f9-bf42-fc02ffec87a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.397962 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-bpwp5" podUID="07adaf47-0b0c-46f9-bf42-fc02ffec87a4" Jan 30 18:48:54 crc kubenswrapper[4782]: E0130 18:48:54.584779 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-bpwp5" podUID="07adaf47-0b0c-46f9-bf42-fc02ffec87a4" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.261038 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.361324 4782 scope.go:117] "RemoveContainer" containerID="cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe" Jan 30 18:48:56 crc kubenswrapper[4782]: E0130 18:48:56.361712 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe\": container with ID starting with cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe not found: ID does not exist" containerID="cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.361743 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe"} err="failed to get container status \"cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe\": rpc error: code = NotFound desc = could not find container \"cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe\": container with ID starting with cccc7e20a87ca18a24f94c0d89651732ed37cb2d5e0ef666563f54bf7fef95fe not found: ID does not exist" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.361765 4782 scope.go:117] "RemoveContainer" containerID="97bbefef290e953f0a9f2ceb37abae11f699d1bfdf4438a148cb27510245ffca" Jan 30 18:48:56 crc kubenswrapper[4782]: E0130 18:48:56.362483 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Jan 30 18:48:56 crc kubenswrapper[4782]: E0130 18:48:56.362509 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Jan 30 18:48:56 crc kubenswrapper[4782]: E0130 18:48:56.362606 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.5:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n695h5c4h75h655h64ch5bfh649h65h54chdch67ch5c9h589h67dh555hdfh5b7h675h654hfchfbh5c9h68fh675hfdhc9h548h648h7dh58fhcch87q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84zqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a13b1ca2-1722-425b-ae48-d34c99f746f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.451042 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.524283 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn69f\" (UniqueName: \"kubernetes.io/projected/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-kube-api-access-bn69f\") pod \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.524494 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-fernet-keys\") pod \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.524576 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-scripts\") pod \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.524612 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-config-data\") pod \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.524663 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-credential-keys\") pod \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.524700 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-combined-ca-bundle\") pod \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\" (UID: \"8d9deb04-4e5b-4f44-990d-9c2c0ce06443\") " Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.530503 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-kube-api-access-bn69f" (OuterVolumeSpecName: 
"kube-api-access-bn69f") pod "8d9deb04-4e5b-4f44-990d-9c2c0ce06443" (UID: "8d9deb04-4e5b-4f44-990d-9c2c0ce06443"). InnerVolumeSpecName "kube-api-access-bn69f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.544537 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8d9deb04-4e5b-4f44-990d-9c2c0ce06443" (UID: "8d9deb04-4e5b-4f44-990d-9c2c0ce06443"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.545023 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8d9deb04-4e5b-4f44-990d-9c2c0ce06443" (UID: "8d9deb04-4e5b-4f44-990d-9c2c0ce06443"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.548267 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-scripts" (OuterVolumeSpecName: "scripts") pod "8d9deb04-4e5b-4f44-990d-9c2c0ce06443" (UID: "8d9deb04-4e5b-4f44-990d-9c2c0ce06443"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.572006 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d9deb04-4e5b-4f44-990d-9c2c0ce06443" (UID: "8d9deb04-4e5b-4f44-990d-9c2c0ce06443"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.576177 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-config-data" (OuterVolumeSpecName: "config-data") pod "8d9deb04-4e5b-4f44-990d-9c2c0ce06443" (UID: "8d9deb04-4e5b-4f44-990d-9c2c0ce06443"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.597772 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfrgc" event={"ID":"8d9deb04-4e5b-4f44-990d-9c2c0ce06443","Type":"ContainerDied","Data":"009edf0f86365ed08c912458866317b8f78a00655ca9bc877defe39a8922a130"} Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.597799 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nfrgc" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.597826 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009edf0f86365ed08c912458866317b8f78a00655ca9bc877defe39a8922a130" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.627309 4782 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.627345 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.627359 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.627374 4782 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.627386 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:56 crc kubenswrapper[4782]: I0130 18:48:56.627397 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn69f\" (UniqueName: \"kubernetes.io/projected/8d9deb04-4e5b-4f44-990d-9c2c0ce06443-kube-api-access-bn69f\") on node \"crc\" DevicePath \"\"" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.538955 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nfrgc"] Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.546801 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nfrgc"] Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.661859 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-h46v5"] Jan 30 18:48:57 crc kubenswrapper[4782]: E0130 18:48:57.662336 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9deb04-4e5b-4f44-990d-9c2c0ce06443" containerName="keystone-bootstrap" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.662355 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9deb04-4e5b-4f44-990d-9c2c0ce06443" containerName="keystone-bootstrap" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.662530 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9deb04-4e5b-4f44-990d-9c2c0ce06443" containerName="keystone-bootstrap" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.663179 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.665171 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.665696 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.666091 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.666282 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7d8px" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.667491 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.677389 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h46v5"] Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.752900 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-fernet-keys\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.752959 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-scripts\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.753156 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-combined-ca-bundle\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.753277 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-credential-keys\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.753465 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-config-data\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.753504 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7b9m\" (UniqueName: \"kubernetes.io/projected/17105ad5-4156-493d-95ee-e27a1d4e8622-kube-api-access-h7b9m\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.855211 4782 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-combined-ca-bundle\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.855321 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-credential-keys\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.855396 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-config-data\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.855419 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7b9m\" (UniqueName: \"kubernetes.io/projected/17105ad5-4156-493d-95ee-e27a1d4e8622-kube-api-access-h7b9m\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.855446 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-fernet-keys\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.855473 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-scripts\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.861508 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-scripts\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.861697 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-credential-keys\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.862439 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-fernet-keys\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.869958 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-combined-ca-bundle\") pod \"keystone-bootstrap-h46v5\" (UID: 
\"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.871783 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-config-data\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.873990 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7b9m\" (UniqueName: \"kubernetes.io/projected/17105ad5-4156-493d-95ee-e27a1d4e8622-kube-api-access-h7b9m\") pod \"keystone-bootstrap-h46v5\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:57 crc kubenswrapper[4782]: I0130 18:48:57.988716 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:48:58 crc kubenswrapper[4782]: I0130 18:48:58.423775 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9deb04-4e5b-4f44-990d-9c2c0ce06443" path="/var/lib/kubelet/pods/8d9deb04-4e5b-4f44-990d-9c2c0ce06443/volumes" Jan 30 18:49:01 crc kubenswrapper[4782]: I0130 18:49:01.261515 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:49:06 crc kubenswrapper[4782]: I0130 18:49:06.262957 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:49:11 crc kubenswrapper[4782]: I0130 18:49:11.264109 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.265170 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.494328 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.505742 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.512840 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.531603 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658441 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtxqc\" (UniqueName: \"kubernetes.io/projected/70bb6236-54ab-4d4b-8219-44b7fa48e716-kube-api-access-xtxqc\") pod \"70bb6236-54ab-4d4b-8219-44b7fa48e716\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658511 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-combined-ca-bundle\") pod \"d874caba-3ebf-4604-a459-efb84bcf3f69\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658540 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnd6t\" (UniqueName: \"kubernetes.io/projected/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-kube-api-access-cnd6t\") pod \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658559 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55nnt\" (UniqueName: \"kubernetes.io/projected/d874caba-3ebf-4604-a459-efb84bcf3f69-kube-api-access-55nnt\") pod \"d874caba-3ebf-4604-a459-efb84bcf3f69\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658593 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-scripts\") pod \"70bb6236-54ab-4d4b-8219-44b7fa48e716\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658613 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-config-data\") pod \"70bb6236-54ab-4d4b-8219-44b7fa48e716\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658654 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70bb6236-54ab-4d4b-8219-44b7fa48e716-logs\") pod \"70bb6236-54ab-4d4b-8219-44b7fa48e716\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658673 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-logs\") pod \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658704 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799a6410-8ee6-42c6-9336-c2f84d59b724-horizon-secret-key\") pod \"799a6410-8ee6-42c6-9336-c2f84d59b724\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658734 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d874caba-3ebf-4604-a459-efb84bcf3f69-logs\") pod \"d874caba-3ebf-4604-a459-efb84bcf3f69\" (UID: 
\"d874caba-3ebf-4604-a459-efb84bcf3f69\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658772 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-scripts\") pod \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658804 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-config-data\") pod \"799a6410-8ee6-42c6-9336-c2f84d59b724\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658843 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-config-data\") pod \"d874caba-3ebf-4604-a459-efb84bcf3f69\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658866 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-scripts\") pod \"799a6410-8ee6-42c6-9336-c2f84d59b724\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658885 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-custom-prometheus-ca\") pod \"d874caba-3ebf-4604-a459-efb84bcf3f69\" (UID: \"d874caba-3ebf-4604-a459-efb84bcf3f69\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658903 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-horizon-secret-key\") pod \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658936 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799a6410-8ee6-42c6-9336-c2f84d59b724-logs\") pod \"799a6410-8ee6-42c6-9336-c2f84d59b724\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.658967 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-config-data\") pod \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\" (UID: \"3bf0d39e-8ee2-4500-b349-ea63b4df15d0\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.659000 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvnqv\" (UniqueName: \"kubernetes.io/projected/799a6410-8ee6-42c6-9336-c2f84d59b724-kube-api-access-gvnqv\") pod \"799a6410-8ee6-42c6-9336-c2f84d59b724\" (UID: \"799a6410-8ee6-42c6-9336-c2f84d59b724\") " Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.659028 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70bb6236-54ab-4d4b-8219-44b7fa48e716-horizon-secret-key\") pod \"70bb6236-54ab-4d4b-8219-44b7fa48e716\" (UID: \"70bb6236-54ab-4d4b-8219-44b7fa48e716\") " Jan 30 
18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.659271 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d874caba-3ebf-4604-a459-efb84bcf3f69-logs" (OuterVolumeSpecName: "logs") pod "d874caba-3ebf-4604-a459-efb84bcf3f69" (UID: "d874caba-3ebf-4604-a459-efb84bcf3f69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.659392 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d874caba-3ebf-4604-a459-efb84bcf3f69-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.659749 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-scripts" (OuterVolumeSpecName: "scripts") pod "3bf0d39e-8ee2-4500-b349-ea63b4df15d0" (UID: "3bf0d39e-8ee2-4500-b349-ea63b4df15d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.660286 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-config-data" (OuterVolumeSpecName: "config-data") pod "799a6410-8ee6-42c6-9336-c2f84d59b724" (UID: "799a6410-8ee6-42c6-9336-c2f84d59b724"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.660999 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70bb6236-54ab-4d4b-8219-44b7fa48e716-logs" (OuterVolumeSpecName: "logs") pod "70bb6236-54ab-4d4b-8219-44b7fa48e716" (UID: "70bb6236-54ab-4d4b-8219-44b7fa48e716"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.661588 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-logs" (OuterVolumeSpecName: "logs") pod "3bf0d39e-8ee2-4500-b349-ea63b4df15d0" (UID: "3bf0d39e-8ee2-4500-b349-ea63b4df15d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.661894 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/799a6410-8ee6-42c6-9336-c2f84d59b724-logs" (OuterVolumeSpecName: "logs") pod "799a6410-8ee6-42c6-9336-c2f84d59b724" (UID: "799a6410-8ee6-42c6-9336-c2f84d59b724"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.661944 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-scripts" (OuterVolumeSpecName: "scripts") pod "70bb6236-54ab-4d4b-8219-44b7fa48e716" (UID: "70bb6236-54ab-4d4b-8219-44b7fa48e716"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.661983 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-config-data" (OuterVolumeSpecName: "config-data") pod "70bb6236-54ab-4d4b-8219-44b7fa48e716" (UID: "70bb6236-54ab-4d4b-8219-44b7fa48e716"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.662214 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70bb6236-54ab-4d4b-8219-44b7fa48e716-kube-api-access-xtxqc" (OuterVolumeSpecName: "kube-api-access-xtxqc") pod "70bb6236-54ab-4d4b-8219-44b7fa48e716" (UID: "70bb6236-54ab-4d4b-8219-44b7fa48e716"). InnerVolumeSpecName "kube-api-access-xtxqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.662440 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-scripts" (OuterVolumeSpecName: "scripts") pod "799a6410-8ee6-42c6-9336-c2f84d59b724" (UID: "799a6410-8ee6-42c6-9336-c2f84d59b724"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.662520 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-config-data" (OuterVolumeSpecName: "config-data") pod "3bf0d39e-8ee2-4500-b349-ea63b4df15d0" (UID: "3bf0d39e-8ee2-4500-b349-ea63b4df15d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.664200 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799a6410-8ee6-42c6-9336-c2f84d59b724-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "799a6410-8ee6-42c6-9336-c2f84d59b724" (UID: "799a6410-8ee6-42c6-9336-c2f84d59b724"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.665427 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799a6410-8ee6-42c6-9336-c2f84d59b724-kube-api-access-gvnqv" (OuterVolumeSpecName: "kube-api-access-gvnqv") pod "799a6410-8ee6-42c6-9336-c2f84d59b724" (UID: "799a6410-8ee6-42c6-9336-c2f84d59b724"). InnerVolumeSpecName "kube-api-access-gvnqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.665884 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3bf0d39e-8ee2-4500-b349-ea63b4df15d0" (UID: "3bf0d39e-8ee2-4500-b349-ea63b4df15d0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.666066 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-kube-api-access-cnd6t" (OuterVolumeSpecName: "kube-api-access-cnd6t") pod "3bf0d39e-8ee2-4500-b349-ea63b4df15d0" (UID: "3bf0d39e-8ee2-4500-b349-ea63b4df15d0"). InnerVolumeSpecName "kube-api-access-cnd6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.666625 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bb6236-54ab-4d4b-8219-44b7fa48e716-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "70bb6236-54ab-4d4b-8219-44b7fa48e716" (UID: "70bb6236-54ab-4d4b-8219-44b7fa48e716"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.667295 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d874caba-3ebf-4604-a459-efb84bcf3f69-kube-api-access-55nnt" (OuterVolumeSpecName: "kube-api-access-55nnt") pod "d874caba-3ebf-4604-a459-efb84bcf3f69" (UID: "d874caba-3ebf-4604-a459-efb84bcf3f69"). InnerVolumeSpecName "kube-api-access-55nnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.692862 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "d874caba-3ebf-4604-a459-efb84bcf3f69" (UID: "d874caba-3ebf-4604-a459-efb84bcf3f69"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.696377 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d874caba-3ebf-4604-a459-efb84bcf3f69" (UID: "d874caba-3ebf-4604-a459-efb84bcf3f69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.721572 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-config-data" (OuterVolumeSpecName: "config-data") pod "d874caba-3ebf-4604-a459-efb84bcf3f69" (UID: "d874caba-3ebf-4604-a459-efb84bcf3f69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760530 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760558 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760568 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760576 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799a6410-8ee6-42c6-9336-c2f84d59b724-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760584 4782 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760594 4782 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760602 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799a6410-8ee6-42c6-9336-c2f84d59b724-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760610 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760618 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvnqv\" (UniqueName: \"kubernetes.io/projected/799a6410-8ee6-42c6-9336-c2f84d59b724-kube-api-access-gvnqv\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760626 4782 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70bb6236-54ab-4d4b-8219-44b7fa48e716-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760634 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtxqc\" (UniqueName: \"kubernetes.io/projected/70bb6236-54ab-4d4b-8219-44b7fa48e716-kube-api-access-xtxqc\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760642 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d874caba-3ebf-4604-a459-efb84bcf3f69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760651 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55nnt\" (UniqueName: \"kubernetes.io/projected/d874caba-3ebf-4604-a459-efb84bcf3f69-kube-api-access-55nnt\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc 
kubenswrapper[4782]: I0130 18:49:16.760658 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnd6t\" (UniqueName: \"kubernetes.io/projected/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-kube-api-access-cnd6t\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760666 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760674 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70bb6236-54ab-4d4b-8219-44b7fa48e716-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760683 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70bb6236-54ab-4d4b-8219-44b7fa48e716-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760691 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bf0d39e-8ee2-4500-b349-ea63b4df15d0-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.760701 4782 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799a6410-8ee6-42c6-9336-c2f84d59b724-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.805333 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d88448fc-75xln" event={"ID":"3bf0d39e-8ee2-4500-b349-ea63b4df15d0","Type":"ContainerDied","Data":"058f3ddba75bddd599412c7beb31e93d6e4d81acea96e4eb1179a8d319f11a87"} Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.805360 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d88448fc-75xln" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.807775 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64c9484657-zqf7j" event={"ID":"799a6410-8ee6-42c6-9336-c2f84d59b724","Type":"ContainerDied","Data":"c3593caf472d8a22294c62766e1f564ee175ed6ca88e17ef7f7d9353c2a071b7"} Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.807836 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64c9484657-zqf7j" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.812046 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bbf49fbcf-9tf5s" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.812067 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bbf49fbcf-9tf5s" event={"ID":"70bb6236-54ab-4d4b-8219-44b7fa48e716","Type":"ContainerDied","Data":"f4dfebfac2b07084b6425c4fb7faf9bbc37197ef888f915a271ef080a1038db9"} Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.813871 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"d874caba-3ebf-4604-a459-efb84bcf3f69","Type":"ContainerDied","Data":"d505a37877bdf0de7e5894e3860a68a3756d552a9b6403605566a883203a9ac8"} Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.813990 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.888046 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d88448fc-75xln"] Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.898891 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d88448fc-75xln"] Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.916625 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64c9484657-zqf7j"] Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.940758 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64c9484657-zqf7j"] Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.954110 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.967649 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.989316 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:16 crc kubenswrapper[4782]: E0130 18:49:16.990706 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.990736 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" Jan 30 18:49:16 crc kubenswrapper[4782]: E0130 18:49:16.990862 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api-log" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.990884 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api-log" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.991134 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.991169 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api-log" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.992728 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:16 crc kubenswrapper[4782]: I0130 18:49:16.994521 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.000684 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bbf49fbcf-9tf5s"] Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.005379 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.015348 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bbf49fbcf-9tf5s"] Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.165727 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.165782 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzzmk\" (UniqueName: \"kubernetes.io/projected/125b8d39-013e-4bcf-90cb-2612455df4fb-kube-api-access-fzzmk\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.165826 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-config-data\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.165842 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.166357 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125b8d39-013e-4bcf-90cb-2612455df4fb-logs\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.268787 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125b8d39-013e-4bcf-90cb-2612455df4fb-logs\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.269180 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125b8d39-013e-4bcf-90cb-2612455df4fb-logs\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.269595 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: 
\"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.269754 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzzmk\" (UniqueName: \"kubernetes.io/projected/125b8d39-013e-4bcf-90cb-2612455df4fb-kube-api-access-fzzmk\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.269866 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-config-data\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.269896 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.272839 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.273865 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.278422 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-config-data\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.288096 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzzmk\" (UniqueName: \"kubernetes.io/projected/125b8d39-013e-4bcf-90cb-2612455df4fb-kube-api-access-fzzmk\") pod \"watcher-api-0\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: E0130 18:49:17.291362 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Jan 30 18:49:17 crc kubenswrapper[4782]: E0130 18:49:17.291421 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Jan 30 18:49:17 crc kubenswrapper[4782]: E0130 18:49:17.291528 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.5:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqb9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-f7fq6_openstack(47aa0756-718b-4d1c-bef5-318895ee6c90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:49:17 crc kubenswrapper[4782]: E0130 18:49:17.292662 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-f7fq6" podUID="47aa0756-718b-4d1c-bef5-318895ee6c90" Jan 30 18:49:17 crc kubenswrapper[4782]: I0130 18:49:17.314365 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:17 crc kubenswrapper[4782]: E0130 18:49:17.834873 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-f7fq6" podUID="47aa0756-718b-4d1c-bef5-318895ee6c90" Jan 30 18:49:18 crc kubenswrapper[4782]: I0130 18:49:18.425035 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf0d39e-8ee2-4500-b349-ea63b4df15d0" path="/var/lib/kubelet/pods/3bf0d39e-8ee2-4500-b349-ea63b4df15d0/volumes" Jan 30 18:49:18 crc kubenswrapper[4782]: I0130 18:49:18.425972 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70bb6236-54ab-4d4b-8219-44b7fa48e716" path="/var/lib/kubelet/pods/70bb6236-54ab-4d4b-8219-44b7fa48e716/volumes" Jan 30 18:49:18 crc kubenswrapper[4782]: I0130 18:49:18.427884 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="799a6410-8ee6-42c6-9336-c2f84d59b724" path="/var/lib/kubelet/pods/799a6410-8ee6-42c6-9336-c2f84d59b724/volumes" Jan 30 18:49:18 crc kubenswrapper[4782]: I0130 18:49:18.428877 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" path="/var/lib/kubelet/pods/d874caba-3ebf-4604-a459-efb84bcf3f69/volumes" Jan 30 18:49:18 crc kubenswrapper[4782]: E0130 18:49:18.862033 4782 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Jan 30 18:49:18 crc kubenswrapper[4782]: E0130 18:49:18.862104 4782 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Jan 30 18:49:18 crc kubenswrapper[4782]: E0130 18:49:18.862308 4782 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.5:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9vxdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mpvf8_openstack(a9ddf9ab-a21e-4d41-b795-c0c926e38a1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 18:49:18 crc kubenswrapper[4782]: E0130 18:49:18.863841 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mpvf8" podUID="a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" Jan 30 18:49:19 crc kubenswrapper[4782]: I0130 18:49:19.043467 4782 scope.go:117] "RemoveContainer" containerID="440897237abf64b329282b2bedba4af81f1f7437554cfd73f1472646b45046ee" Jan 30 18:49:19 crc kubenswrapper[4782]: I0130 18:49:19.692316 4782 scope.go:117] "RemoveContainer" containerID="62130a5829f203b8ec1cd565b6fea1c87394f2712df2e49778f96a1795ddcfd4" Jan 30 18:49:19 crc kubenswrapper[4782]: I0130 18:49:19.838545 4782 scope.go:117] "RemoveContainer" containerID="5c3e8dbfd9edf846d7085f465be4ab99ea5db8a559a96dd0e769f75d444cf786" Jan 30 18:49:19 crc kubenswrapper[4782]: I0130 18:49:19.865796 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerStarted","Data":"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9"} Jan 30 18:49:19 crc kubenswrapper[4782]: E0130 18:49:19.867063 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-mpvf8" podUID="a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.217686 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h46v5"] Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.228926 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b5464cf9b-2tbsc"] Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.241728 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d67b5c94d-pwj69"] Jan 30 18:49:20 crc kubenswrapper[4782]: W0130 18:49:20.249616 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17105ad5_4156_493d_95ee_e27a1d4e8622.slice/crio-61b9fcf1275fa7a0373072f04c1dc14458b84dc12f6aff576094699ffe817255 WatchSource:0}: Error finding container 61b9fcf1275fa7a0373072f04c1dc14458b84dc12f6aff576094699ffe817255: Status 404 returned error can't find the container with id 61b9fcf1275fa7a0373072f04c1dc14458b84dc12f6aff576094699ffe817255 Jan 30 18:49:20 crc kubenswrapper[4782]: W0130 18:49:20.250106 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53e414f7_9297_46fb_87b6_19ce7ee55758.slice/crio-cd549f83175d1e6fb6c0346b6b7b15b3589d611ef01f16c86a24981e92d7150f WatchSource:0}: Error finding container cd549f83175d1e6fb6c0346b6b7b15b3589d611ef01f16c86a24981e92d7150f: Status 404 returned error can't find the container with id cd549f83175d1e6fb6c0346b6b7b15b3589d611ef01f16c86a24981e92d7150f Jan 30 18:49:20 crc kubenswrapper[4782]: W0130 18:49:20.256665 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2e303df_cb69_4f2f_909f_a1651d376adc.slice/crio-7089e04dabf15ad197d928d57d70d81c4b4244b3196f530352d3562c2a99555e WatchSource:0}: Error finding container 7089e04dabf15ad197d928d57d70d81c4b4244b3196f530352d3562c2a99555e: Status 404 returned error can't find the container with id 7089e04dabf15ad197d928d57d70d81c4b4244b3196f530352d3562c2a99555e Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.399747 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.879793 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"125b8d39-013e-4bcf-90cb-2612455df4fb","Type":"ContainerStarted","Data":"59bbaedbc5f389bda7fcdb690fa845c62355e675c843254ca737016673e32a48"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.881300 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5464cf9b-2tbsc" event={"ID":"b2e303df-cb69-4f2f-909f-a1651d376adc","Type":"ContainerStarted","Data":"7089e04dabf15ad197d928d57d70d81c4b4244b3196f530352d3562c2a99555e"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.882832 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-p9dwp" event={"ID":"c0087b5d-ef94-4433-9e0e-23b509dd3003","Type":"ContainerStarted","Data":"961c540b3814e2a30632333589d5245e623a044016634d43078e0baed9bcfd9a"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.898045 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h46v5" event={"ID":"17105ad5-4156-493d-95ee-e27a1d4e8622","Type":"ContainerStarted","Data":"e065c5928b93422455efb54aab9b90775c8fff25f56b7b96a48380a896d91931"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.900787 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h46v5" event={"ID":"17105ad5-4156-493d-95ee-e27a1d4e8622","Type":"ContainerStarted","Data":"61b9fcf1275fa7a0373072f04c1dc14458b84dc12f6aff576094699ffe817255"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.907393 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" event={"ID":"60cc626f-3c26-4417-87f0-5000cdbaadda","Type":"ContainerStarted","Data":"65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.907564 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.910295 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-p9dwp" podStartSLOduration=5.023111097 podStartE2EDuration="59.910279433s" podCreationTimestamp="2026-01-30 18:48:21 +0000 UTC" firstStartedPulling="2026-01-30 18:48:23.838691631 +0000 UTC m=+1080.107069656" lastFinishedPulling="2026-01-30 18:49:18.725859937 +0000 UTC m=+1134.994237992" observedRunningTime="2026-01-30 18:49:20.898104242 +0000 UTC m=+1137.166482297" watchObservedRunningTime="2026-01-30 18:49:20.910279433 +0000 UTC m=+1137.178657458" Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.910970 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2db1b143-b7ce-4cc4-8412-6e3402508e98","Type":"ContainerStarted","Data":"89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.912410 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d67b5c94d-pwj69" event={"ID":"53e414f7-9297-46fb-87b6-19ce7ee55758","Type":"ContainerStarted","Data":"cd549f83175d1e6fb6c0346b6b7b15b3589d611ef01f16c86a24981e92d7150f"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.913673 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"627f3bc8-cc79-4500-81d4-4dc041b88394","Type":"ContainerStarted","Data":"be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.919263 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerStarted","Data":"998a8f9c6850a8020aeea9b233da454493113420844d9bbcb36d0ba8317f16f3"} Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.961443 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-h46v5" podStartSLOduration=23.961420908 podStartE2EDuration="23.961420908s" podCreationTimestamp="2026-01-30 18:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-30 18:49:20.935845136 +0000 UTC m=+1137.204223171" watchObservedRunningTime="2026-01-30 18:49:20.961420908 +0000 UTC m=+1137.229798943" Jan 30 18:49:20 crc kubenswrapper[4782]: I0130 18:49:20.970977 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=7.353346518 podStartE2EDuration="1m0.970954004s" podCreationTimestamp="2026-01-30 18:48:20 +0000 UTC" firstStartedPulling="2026-01-30 18:48:22.726331652 +0000 UTC m=+1078.994709677" lastFinishedPulling="2026-01-30 18:49:16.343939138 +0000 UTC m=+1132.612317163" observedRunningTime="2026-01-30 18:49:20.954405104 +0000 UTC m=+1137.222783139" watchObservedRunningTime="2026-01-30 18:49:20.970954004 +0000 UTC m=+1137.239332039" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.018600 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=69.018572141 podStartE2EDuration="1m9.018572141s" podCreationTimestamp="2026-01-30 18:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:21.003744205 +0000 UTC m=+1137.272122230" watchObservedRunningTime="2026-01-30 18:49:21.018572141 +0000 UTC m=+1137.286950176" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.034071 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=5.971400517 podStartE2EDuration="1m0.034049684s" podCreationTimestamp="2026-01-30 18:48:21 +0000 UTC" firstStartedPulling="2026-01-30 18:48:23.214127747 +0000 UTC m=+1079.482505772" lastFinishedPulling="2026-01-30 18:49:17.276776894 +0000 UTC m=+1133.545154939" observedRunningTime="2026-01-30 18:49:21.023971795 +0000 UTC m=+1137.292349820" watchObservedRunningTime="2026-01-30 18:49:21.034049684 +0000 UTC m=+1137.302427719" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.266640 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="d874caba-3ebf-4604-a459-efb84bcf3f69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.781296 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.781628 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.808546 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.808585 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.808661 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.832172 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" podStartSLOduration=60.83215121 podStartE2EDuration="1m0.83215121s" podCreationTimestamp="2026-01-30 18:48:21 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:21.042429841 +0000 UTC m=+1137.310807866" watchObservedRunningTime="2026-01-30 18:49:21.83215121 +0000 UTC m=+1138.100529235" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.835871 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.940434 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bpwp5" event={"ID":"07adaf47-0b0c-46f9-bf42-fc02ffec87a4","Type":"ContainerStarted","Data":"3cb806c040408344fb39ee4e0454be212dc4c5bafa67da379f195fafca08c84b"} Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.943888 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"125b8d39-013e-4bcf-90cb-2612455df4fb","Type":"ContainerStarted","Data":"3b0c40302c78a633b6f37037e995da1f93a66ca65ecef842dccae7a4f41bb7d8"} Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.963014 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-bpwp5" podStartSLOduration=3.618657656 podStartE2EDuration="1m6.962994995s" podCreationTimestamp="2026-01-30 18:48:15 +0000 UTC" firstStartedPulling="2026-01-30 18:48:16.359707375 +0000 UTC m=+1072.628085400" lastFinishedPulling="2026-01-30 18:49:19.704044724 +0000 UTC m=+1135.972422739" observedRunningTime="2026-01-30 18:49:21.961726644 +0000 UTC m=+1138.230104679" watchObservedRunningTime="2026-01-30 18:49:21.962994995 +0000 UTC m=+1138.231373020" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.975344 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Jan 30 18:49:21 crc kubenswrapper[4782]: I0130 18:49:21.979524 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:22 crc kubenswrapper[4782]: I0130 18:49:22.035672 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:49:22 crc kubenswrapper[4782]: I0130 18:49:22.049389 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:49:22 crc kubenswrapper[4782]: I0130 18:49:22.605070 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 18:49:23 crc kubenswrapper[4782]: I0130 18:49:23.972031 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="627f3bc8-cc79-4500-81d4-4dc041b88394" containerName="watcher-decision-engine" containerID="cri-o://be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4" gracePeriod=30 Jan 30 18:49:23 crc kubenswrapper[4782]: I0130 18:49:23.972125 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" containerID="cri-o://89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" gracePeriod=30 Jan 30 18:49:24 crc kubenswrapper[4782]: I0130 18:49:24.986633 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d67b5c94d-pwj69" event={"ID":"53e414f7-9297-46fb-87b6-19ce7ee55758","Type":"ContainerStarted","Data":"e3ffd64d6b4b6cfa59e8fb18ab3bcdc4a96156ccf5ae37c7e7a81cf00a9a6af1"} Jan 30 18:49:24 crc kubenswrapper[4782]: 
I0130 18:49:24.986672 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d67b5c94d-pwj69" event={"ID":"53e414f7-9297-46fb-87b6-19ce7ee55758","Type":"ContainerStarted","Data":"9702697e78345aa2295f005a042f0f8f8601206a988ac9e5a0f5fbef20f429bb"} Jan 30 18:49:24 crc kubenswrapper[4782]: I0130 18:49:24.996040 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"125b8d39-013e-4bcf-90cb-2612455df4fb","Type":"ContainerStarted","Data":"43350e6776d7465ef3ffaa98b5d0ba0724a4e670f4fd1d20ad1a8f73ea885eb7"} Jan 30 18:49:24 crc kubenswrapper[4782]: I0130 18:49:24.996294 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 18:49:25 crc kubenswrapper[4782]: I0130 18:49:25.008668 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5464cf9b-2tbsc" event={"ID":"b2e303df-cb69-4f2f-909f-a1651d376adc","Type":"ContainerStarted","Data":"721c73bb806b44f16ca65b0e3d3bdf2b72298ad7af4c050b8fb757c30b1fb414"} Jan 30 18:49:25 crc kubenswrapper[4782]: I0130 18:49:25.008750 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5464cf9b-2tbsc" event={"ID":"b2e303df-cb69-4f2f-909f-a1651d376adc","Type":"ContainerStarted","Data":"441cb98b7bdcb9129ff7e175460fd76e74ce990bde17ddae86c8e1c5a95f2c84"} Jan 30 18:49:25 crc kubenswrapper[4782]: I0130 18:49:25.018062 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d67b5c94d-pwj69" podStartSLOduration=52.095444551 podStartE2EDuration="56.018040229s" podCreationTimestamp="2026-01-30 18:48:29 +0000 UTC" firstStartedPulling="2026-01-30 18:49:20.251621975 +0000 UTC m=+1136.520000000" lastFinishedPulling="2026-01-30 18:49:24.174217653 +0000 UTC m=+1140.442595678" observedRunningTime="2026-01-30 18:49:25.017750652 +0000 UTC m=+1141.286128677" watchObservedRunningTime="2026-01-30 18:49:25.018040229 +0000 UTC m=+1141.286418254" Jan 30 18:49:25 crc kubenswrapper[4782]: I0130 18:49:25.047061 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=9.047041686 podStartE2EDuration="9.047041686s" podCreationTimestamp="2026-01-30 18:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:25.040878824 +0000 UTC m=+1141.309256849" watchObservedRunningTime="2026-01-30 18:49:25.047041686 +0000 UTC m=+1141.315419711" Jan 30 18:49:25 crc kubenswrapper[4782]: I0130 18:49:25.062529 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b5464cf9b-2tbsc" podStartSLOduration=52.12931996 podStartE2EDuration="56.062512459s" podCreationTimestamp="2026-01-30 18:48:29 +0000 UTC" firstStartedPulling="2026-01-30 18:49:20.261052259 +0000 UTC m=+1136.529430284" lastFinishedPulling="2026-01-30 18:49:24.194244758 +0000 UTC m=+1140.462622783" observedRunningTime="2026-01-30 18:49:25.059007952 +0000 UTC m=+1141.327385977" watchObservedRunningTime="2026-01-30 18:49:25.062512459 +0000 UTC m=+1141.330890484" Jan 30 18:49:26 crc kubenswrapper[4782]: I0130 18:49:26.022432 4782 generic.go:334] "Generic (PLEG): container finished" podID="17105ad5-4156-493d-95ee-e27a1d4e8622" containerID="e065c5928b93422455efb54aab9b90775c8fff25f56b7b96a48380a896d91931" exitCode=0 Jan 30 18:49:26 crc kubenswrapper[4782]: I0130 18:49:26.022511 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-h46v5" event={"ID":"17105ad5-4156-493d-95ee-e27a1d4e8622","Type":"ContainerDied","Data":"e065c5928b93422455efb54aab9b90775c8fff25f56b7b96a48380a896d91931"} Jan 30 18:49:26 crc kubenswrapper[4782]: E0130 18:49:26.782381 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:26 crc kubenswrapper[4782]: E0130 18:49:26.783817 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:26 crc kubenswrapper[4782]: E0130 18:49:26.784855 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:26 crc kubenswrapper[4782]: E0130 18:49:26.784883 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.147721 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.313650 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.313694 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.327967 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.404458 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.480598 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ddc495b5-zgrm6"] Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.481100 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerName="dnsmasq-dns" containerID="cri-o://fb83cbd6b0e8b328ce1008786a222eced5d79485fdf4a18d7bc18dcf71bc4267" gracePeriod=10 Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.605196 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 18:49:27 crc kubenswrapper[4782]: I0130 18:49:27.612523 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 18:49:28 crc kubenswrapper[4782]: I0130 18:49:28.042971 4782 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 18:49:28 crc kubenswrapper[4782]: I0130 18:49:28.045321 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 18:49:28 crc kubenswrapper[4782]: I0130 18:49:28.564315 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Jan 30 18:49:30 crc kubenswrapper[4782]: I0130 18:49:30.060804 4782 generic.go:334] "Generic (PLEG): container finished" podID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerID="fb83cbd6b0e8b328ce1008786a222eced5d79485fdf4a18d7bc18dcf71bc4267" exitCode=0 Jan 30 18:49:30 crc kubenswrapper[4782]: I0130 18:49:30.061018 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" event={"ID":"87763cd9-7b99-4b9a-8e0f-02ea849a6b56","Type":"ContainerDied","Data":"fb83cbd6b0e8b328ce1008786a222eced5d79485fdf4a18d7bc18dcf71bc4267"} Jan 30 18:49:30 crc kubenswrapper[4782]: I0130 18:49:30.110101 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:49:30 crc kubenswrapper[4782]: I0130 18:49:30.110165 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:49:30 crc kubenswrapper[4782]: I0130 18:49:30.204992 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:49:30 crc kubenswrapper[4782]: I0130 18:49:30.205042 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.180639 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.181053 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api" containerID="cri-o://43350e6776d7465ef3ffaa98b5d0ba0724a4e670f4fd1d20ad1a8f73ea885eb7" gracePeriod=30 Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.181258 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api-log" containerID="cri-o://3b0c40302c78a633b6f37037e995da1f93a66ca65ecef842dccae7a4f41bb7d8" gracePeriod=30 Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.388975 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.540960 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-scripts\") pod \"17105ad5-4156-493d-95ee-e27a1d4e8622\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.541060 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-credential-keys\") pod \"17105ad5-4156-493d-95ee-e27a1d4e8622\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.541100 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7b9m\" (UniqueName: \"kubernetes.io/projected/17105ad5-4156-493d-95ee-e27a1d4e8622-kube-api-access-h7b9m\") pod \"17105ad5-4156-493d-95ee-e27a1d4e8622\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.541136 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-fernet-keys\") pod \"17105ad5-4156-493d-95ee-e27a1d4e8622\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.541174 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-combined-ca-bundle\") pod \"17105ad5-4156-493d-95ee-e27a1d4e8622\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.541250 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-config-data\") pod \"17105ad5-4156-493d-95ee-e27a1d4e8622\" (UID: \"17105ad5-4156-493d-95ee-e27a1d4e8622\") " Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.547369 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "17105ad5-4156-493d-95ee-e27a1d4e8622" (UID: "17105ad5-4156-493d-95ee-e27a1d4e8622"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.559424 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-scripts" (OuterVolumeSpecName: "scripts") pod "17105ad5-4156-493d-95ee-e27a1d4e8622" (UID: "17105ad5-4156-493d-95ee-e27a1d4e8622"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.563374 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "17105ad5-4156-493d-95ee-e27a1d4e8622" (UID: "17105ad5-4156-493d-95ee-e27a1d4e8622"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.563452 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17105ad5-4156-493d-95ee-e27a1d4e8622-kube-api-access-h7b9m" (OuterVolumeSpecName: "kube-api-access-h7b9m") pod "17105ad5-4156-493d-95ee-e27a1d4e8622" (UID: "17105ad5-4156-493d-95ee-e27a1d4e8622"). InnerVolumeSpecName "kube-api-access-h7b9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.571362 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-config-data" (OuterVolumeSpecName: "config-data") pod "17105ad5-4156-493d-95ee-e27a1d4e8622" (UID: "17105ad5-4156-493d-95ee-e27a1d4e8622"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.571561 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17105ad5-4156-493d-95ee-e27a1d4e8622" (UID: "17105ad5-4156-493d-95ee-e27a1d4e8622"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.643646 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.643695 4782 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.643707 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7b9m\" (UniqueName: \"kubernetes.io/projected/17105ad5-4156-493d-95ee-e27a1d4e8622-kube-api-access-h7b9m\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.643717 4782 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.643726 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:31 crc kubenswrapper[4782]: I0130 18:49:31.643733 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17105ad5-4156-493d-95ee-e27a1d4e8622-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:31 crc kubenswrapper[4782]: E0130 18:49:31.783376 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:31 crc kubenswrapper[4782]: E0130 18:49:31.785011 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:31 crc kubenswrapper[4782]: E0130 18:49:31.786478 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:31 crc kubenswrapper[4782]: E0130 18:49:31.786520 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.103329 4782 generic.go:334] "Generic (PLEG): container finished" podID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerID="3b0c40302c78a633b6f37037e995da1f93a66ca65ecef842dccae7a4f41bb7d8" exitCode=143 Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.103423 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"125b8d39-013e-4bcf-90cb-2612455df4fb","Type":"ContainerDied","Data":"3b0c40302c78a633b6f37037e995da1f93a66ca65ecef842dccae7a4f41bb7d8"} Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.105778 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h46v5" event={"ID":"17105ad5-4156-493d-95ee-e27a1d4e8622","Type":"ContainerDied","Data":"61b9fcf1275fa7a0373072f04c1dc14458b84dc12f6aff576094699ffe817255"} Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.105819 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61b9fcf1275fa7a0373072f04c1dc14458b84dc12f6aff576094699ffe817255" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.105887 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h46v5" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.315033 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.165:9322/\": dial tcp 10.217.0.165:9322: connect: connection refused" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.315150 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9322/\": dial tcp 10.217.0.165:9322: connect: connection refused" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.509791 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.570325 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5c6f877b5f-8gdbg"] Jan 30 18:49:32 crc kubenswrapper[4782]: E0130 18:49:32.570811 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17105ad5-4156-493d-95ee-e27a1d4e8622" containerName="keystone-bootstrap" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.570833 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="17105ad5-4156-493d-95ee-e27a1d4e8622" containerName="keystone-bootstrap" Jan 30 18:49:32 crc kubenswrapper[4782]: E0130 18:49:32.570851 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerName="dnsmasq-dns" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.570861 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerName="dnsmasq-dns" Jan 30 18:49:32 crc kubenswrapper[4782]: E0130 18:49:32.570876 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerName="init" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.570885 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerName="init" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.571132 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="17105ad5-4156-493d-95ee-e27a1d4e8622" containerName="keystone-bootstrap" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.571166 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" containerName="dnsmasq-dns" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.571935 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.577991 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.578036 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.578169 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.578283 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.578346 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7d8px" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.580066 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.580986 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5c6f877b5f-8gdbg"] Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.661623 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-dns-svc\") pod \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.661696 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-nb\") pod \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.661777 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-sb\") pod \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.661867 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r4f5\" (UniqueName: \"kubernetes.io/projected/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-kube-api-access-8r4f5\") pod \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.661891 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-config\") pod \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\" (UID: \"87763cd9-7b99-4b9a-8e0f-02ea849a6b56\") " Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662140 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-credential-keys\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662175 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-combined-ca-bundle\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662208 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-fernet-keys\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662247 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-config-data\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662280 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-public-tls-certs\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662308 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-internal-tls-certs\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662330 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-scripts\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.662358 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67gfg\" (UniqueName: \"kubernetes.io/projected/199910fe-a283-4898-bd2b-69b6e1b7266b-kube-api-access-67gfg\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.666829 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-kube-api-access-8r4f5" (OuterVolumeSpecName: "kube-api-access-8r4f5") pod "87763cd9-7b99-4b9a-8e0f-02ea849a6b56" (UID: "87763cd9-7b99-4b9a-8e0f-02ea849a6b56"). InnerVolumeSpecName "kube-api-access-8r4f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.715559 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87763cd9-7b99-4b9a-8e0f-02ea849a6b56" (UID: "87763cd9-7b99-4b9a-8e0f-02ea849a6b56"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.717314 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87763cd9-7b99-4b9a-8e0f-02ea849a6b56" (UID: "87763cd9-7b99-4b9a-8e0f-02ea849a6b56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.725067 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87763cd9-7b99-4b9a-8e0f-02ea849a6b56" (UID: "87763cd9-7b99-4b9a-8e0f-02ea849a6b56"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.735496 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-config" (OuterVolumeSpecName: "config") pod "87763cd9-7b99-4b9a-8e0f-02ea849a6b56" (UID: "87763cd9-7b99-4b9a-8e0f-02ea849a6b56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766654 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-credential-keys\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766743 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-combined-ca-bundle\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766777 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-fernet-keys\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766827 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-config-data\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766850 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-public-tls-certs\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766899 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-internal-tls-certs\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: 
\"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766925 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-scripts\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.766978 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67gfg\" (UniqueName: \"kubernetes.io/projected/199910fe-a283-4898-bd2b-69b6e1b7266b-kube-api-access-67gfg\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.767052 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.767063 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.767073 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.767082 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r4f5\" (UniqueName: \"kubernetes.io/projected/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-kube-api-access-8r4f5\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.767090 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87763cd9-7b99-4b9a-8e0f-02ea849a6b56-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.770794 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-internal-tls-certs\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.770830 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-credential-keys\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.770853 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-scripts\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.771072 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-fernet-keys\") pod 
\"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.772447 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-config-data\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.773058 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-combined-ca-bundle\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.781254 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/199910fe-a283-4898-bd2b-69b6e1b7266b-public-tls-certs\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.788476 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67gfg\" (UniqueName: \"kubernetes.io/projected/199910fe-a283-4898-bd2b-69b6e1b7266b-kube-api-access-67gfg\") pod \"keystone-5c6f877b5f-8gdbg\" (UID: \"199910fe-a283-4898-bd2b-69b6e1b7266b\") " pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:32 crc kubenswrapper[4782]: I0130 18:49:32.890426 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.119041 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" event={"ID":"87763cd9-7b99-4b9a-8e0f-02ea849a6b56","Type":"ContainerDied","Data":"69de1d0db2a5cd41e7321dcadb6d495c25ab9bfc48f453ac6d03e314608d82f9"} Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.119569 4782 scope.go:117] "RemoveContainer" containerID="fb83cbd6b0e8b328ce1008786a222eced5d79485fdf4a18d7bc18dcf71bc4267" Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.119735 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84ddc495b5-zgrm6" Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.124003 4782 generic.go:334] "Generic (PLEG): container finished" podID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerID="43350e6776d7465ef3ffaa98b5d0ba0724a4e670f4fd1d20ad1a8f73ea885eb7" exitCode=0 Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.124108 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"125b8d39-013e-4bcf-90cb-2612455df4fb","Type":"ContainerDied","Data":"43350e6776d7465ef3ffaa98b5d0ba0724a4e670f4fd1d20ad1a8f73ea885eb7"} Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.146411 4782 scope.go:117] "RemoveContainer" containerID="4cc769be6d390810ecd41ff54dfb748892cb5b5e4147d5bbde48823235e1945a" Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.171274 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84ddc495b5-zgrm6"] Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.181346 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84ddc495b5-zgrm6"] Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.372081 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5c6f877b5f-8gdbg"] Jan 30 18:49:33 crc kubenswrapper[4782]: I0130 18:49:33.915872 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.095736 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125b8d39-013e-4bcf-90cb-2612455df4fb-logs\") pod \"125b8d39-013e-4bcf-90cb-2612455df4fb\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.095851 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzzmk\" (UniqueName: \"kubernetes.io/projected/125b8d39-013e-4bcf-90cb-2612455df4fb-kube-api-access-fzzmk\") pod \"125b8d39-013e-4bcf-90cb-2612455df4fb\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.095967 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-custom-prometheus-ca\") pod \"125b8d39-013e-4bcf-90cb-2612455df4fb\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.096030 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-config-data\") pod \"125b8d39-013e-4bcf-90cb-2612455df4fb\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.096101 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-combined-ca-bundle\") pod \"125b8d39-013e-4bcf-90cb-2612455df4fb\" (UID: \"125b8d39-013e-4bcf-90cb-2612455df4fb\") " Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.101104 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/125b8d39-013e-4bcf-90cb-2612455df4fb-logs" (OuterVolumeSpecName: "logs") pod "125b8d39-013e-4bcf-90cb-2612455df4fb" (UID: 
"125b8d39-013e-4bcf-90cb-2612455df4fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.103204 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125b8d39-013e-4bcf-90cb-2612455df4fb-kube-api-access-fzzmk" (OuterVolumeSpecName: "kube-api-access-fzzmk") pod "125b8d39-013e-4bcf-90cb-2612455df4fb" (UID: "125b8d39-013e-4bcf-90cb-2612455df4fb"). InnerVolumeSpecName "kube-api-access-fzzmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.126457 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "125b8d39-013e-4bcf-90cb-2612455df4fb" (UID: "125b8d39-013e-4bcf-90cb-2612455df4fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.133143 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c6f877b5f-8gdbg" event={"ID":"199910fe-a283-4898-bd2b-69b6e1b7266b","Type":"ContainerStarted","Data":"b7e4ce5aa458393de67da03092d0b8e7344c92c85581cc32ef3705c30474e462"} Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.135980 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"125b8d39-013e-4bcf-90cb-2612455df4fb","Type":"ContainerDied","Data":"59bbaedbc5f389bda7fcdb690fa845c62355e675c843254ca737016673e32a48"} Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.136009 4782 scope.go:117] "RemoveContainer" containerID="43350e6776d7465ef3ffaa98b5d0ba0724a4e670f4fd1d20ad1a8f73ea885eb7" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.136150 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.145283 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "125b8d39-013e-4bcf-90cb-2612455df4fb" (UID: "125b8d39-013e-4bcf-90cb-2612455df4fb"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.155200 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-config-data" (OuterVolumeSpecName: "config-data") pod "125b8d39-013e-4bcf-90cb-2612455df4fb" (UID: "125b8d39-013e-4bcf-90cb-2612455df4fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.157434 4782 scope.go:117] "RemoveContainer" containerID="3b0c40302c78a633b6f37037e995da1f93a66ca65ecef842dccae7a4f41bb7d8" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.199290 4782 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.199320 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.199330 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b8d39-013e-4bcf-90cb-2612455df4fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.199338 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/125b8d39-013e-4bcf-90cb-2612455df4fb-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.199351 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzzmk\" (UniqueName: \"kubernetes.io/projected/125b8d39-013e-4bcf-90cb-2612455df4fb-kube-api-access-fzzmk\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.427341 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87763cd9-7b99-4b9a-8e0f-02ea849a6b56" path="/var/lib/kubelet/pods/87763cd9-7b99-4b9a-8e0f-02ea849a6b56/volumes" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.495314 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.516536 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.526862 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:34 crc kubenswrapper[4782]: E0130 18:49:34.527711 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.527836 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api" Jan 30 18:49:34 crc kubenswrapper[4782]: E0130 18:49:34.527982 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api-log" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.528081 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api-log" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.529178 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api-log" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.529377 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" containerName="watcher-api" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.531110 4782 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.533111 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.533656 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.536839 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.537566 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.710000 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.712160 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf22g\" (UniqueName: \"kubernetes.io/projected/68609276-cd5e-43d1-bef5-c79ef0628d5b-kube-api-access-lf22g\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.712399 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.712585 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68609276-cd5e-43d1-bef5-c79ef0628d5b-logs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.712790 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.712980 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-config-data\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.713125 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.815624 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.815717 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68609276-cd5e-43d1-bef5-c79ef0628d5b-logs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.815782 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.815844 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-config-data\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.815873 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.816027 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.816107 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf22g\" (UniqueName: \"kubernetes.io/projected/68609276-cd5e-43d1-bef5-c79ef0628d5b-kube-api-access-lf22g\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.816531 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68609276-cd5e-43d1-bef5-c79ef0628d5b-logs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.823946 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.824674 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.825123 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-config-data\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.835073 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.835594 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/68609276-cd5e-43d1-bef5-c79ef0628d5b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.850018 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf22g\" (UniqueName: \"kubernetes.io/projected/68609276-cd5e-43d1-bef5-c79ef0628d5b-kube-api-access-lf22g\") pod \"watcher-api-0\" (UID: \"68609276-cd5e-43d1-bef5-c79ef0628d5b\") " pod="openstack/watcher-api-0" Jan 30 18:49:34 crc kubenswrapper[4782]: I0130 18:49:34.852823 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Jan 30 18:49:35 crc kubenswrapper[4782]: I0130 18:49:35.147602 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c6f877b5f-8gdbg" event={"ID":"199910fe-a283-4898-bd2b-69b6e1b7266b","Type":"ContainerStarted","Data":"fbfde63f1d542e8f493150cc2a98d33a46f0ef9aa014664ecdfe66f7b1ae8aaa"} Jan 30 18:49:35 crc kubenswrapper[4782]: I0130 18:49:35.148308 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:49:35 crc kubenswrapper[4782]: I0130 18:49:35.180053 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5c6f877b5f-8gdbg" podStartSLOduration=3.180035682 podStartE2EDuration="3.180035682s" podCreationTimestamp="2026-01-30 18:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:35.173453019 +0000 UTC m=+1151.441831044" watchObservedRunningTime="2026-01-30 18:49:35.180035682 +0000 UTC m=+1151.448413707" Jan 30 18:49:35 crc kubenswrapper[4782]: I0130 18:49:35.543809 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.161679 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"68609276-cd5e-43d1-bef5-c79ef0628d5b","Type":"ContainerStarted","Data":"f1e30f711a9181fc1542b95372ef2ea33585f1f7da937ec2b40ebb2714403fb8"} Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.161724 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"68609276-cd5e-43d1-bef5-c79ef0628d5b","Type":"ContainerStarted","Data":"80ab42f12e2546a4116f4c63f356f6a2481f9f160b6a2ce0c533cf02eac70c65"} Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.161733 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"68609276-cd5e-43d1-bef5-c79ef0628d5b","Type":"ContainerStarted","Data":"762c37f3b251da1499d42f7f61fa62ae481dae527a72c686ffca0504cf0a010f"} Jan 30 18:49:36 crc 
kubenswrapper[4782]: I0130 18:49:36.162941 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.163338 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="68609276-cd5e-43d1-bef5-c79ef0628d5b" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.167:9322/\": dial tcp 10.217.0.167:9322: connect: connection refused" Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.164927 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f7fq6" event={"ID":"47aa0756-718b-4d1c-bef5-318895ee6c90","Type":"ContainerStarted","Data":"e0d6b134827799f724f3cf412addb6b1b4a4ef4863d14aad089a1325736c2ff4"} Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.166650 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mpvf8" event={"ID":"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e","Type":"ContainerStarted","Data":"3a215ed02134533ced39cc9b6b783ff46122b85767c17dfaf3a6c8155209cfa7"} Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.169243 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerStarted","Data":"afe1cbacb9d3cf97a4983e60b1791188f97e69195df5975b0ded4b56cd6e4801"} Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.187054 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.187036243 podStartE2EDuration="2.187036243s" podCreationTimestamp="2026-01-30 18:49:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:36.182667765 +0000 UTC m=+1152.451045790" watchObservedRunningTime="2026-01-30 18:49:36.187036243 +0000 UTC m=+1152.455414268" Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.206598 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mpvf8" podStartSLOduration=3.3687242680000002 podStartE2EDuration="1m15.206576536s" podCreationTimestamp="2026-01-30 18:48:21 +0000 UTC" firstStartedPulling="2026-01-30 18:48:23.219256334 +0000 UTC m=+1079.487634359" lastFinishedPulling="2026-01-30 18:49:35.057108602 +0000 UTC m=+1151.325486627" observedRunningTime="2026-01-30 18:49:36.199000529 +0000 UTC m=+1152.467378564" watchObservedRunningTime="2026-01-30 18:49:36.206576536 +0000 UTC m=+1152.474954561" Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.223956 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-f7fq6" podStartSLOduration=4.150870568 podStartE2EDuration="1m15.223932505s" podCreationTimestamp="2026-01-30 18:48:21 +0000 UTC" firstStartedPulling="2026-01-30 18:48:23.838988888 +0000 UTC m=+1080.107366913" lastFinishedPulling="2026-01-30 18:49:34.912050825 +0000 UTC m=+1151.180428850" observedRunningTime="2026-01-30 18:49:36.219148987 +0000 UTC m=+1152.487527012" watchObservedRunningTime="2026-01-30 18:49:36.223932505 +0000 UTC m=+1152.492310540" Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.425426 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="125b8d39-013e-4bcf-90cb-2612455df4fb" path="/var/lib/kubelet/pods/125b8d39-013e-4bcf-90cb-2612455df4fb/volumes" Jan 30 18:49:36 crc kubenswrapper[4782]: E0130 18:49:36.782383 4782 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:36 crc kubenswrapper[4782]: E0130 18:49:36.789361 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:36 crc kubenswrapper[4782]: E0130 18:49:36.790745 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:36 crc kubenswrapper[4782]: E0130 18:49:36.790799 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:36 crc kubenswrapper[4782]: I0130 18:49:36.986289 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.179254 4782 generic.go:334] "Generic (PLEG): container finished" podID="c0087b5d-ef94-4433-9e0e-23b509dd3003" containerID="961c540b3814e2a30632333589d5245e623a044016634d43078e0baed9bcfd9a" exitCode=0 Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.179319 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p9dwp" event={"ID":"c0087b5d-ef94-4433-9e0e-23b509dd3003","Type":"ContainerDied","Data":"961c540b3814e2a30632333589d5245e623a044016634d43078e0baed9bcfd9a"} Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.181907 4782 generic.go:334] "Generic (PLEG): container finished" podID="627f3bc8-cc79-4500-81d4-4dc041b88394" containerID="be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4" exitCode=1 Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.183265 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.183754 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-config-data\") pod \"627f3bc8-cc79-4500-81d4-4dc041b88394\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.183810 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-combined-ca-bundle\") pod \"627f3bc8-cc79-4500-81d4-4dc041b88394\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.183840 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzpx5\" (UniqueName: \"kubernetes.io/projected/627f3bc8-cc79-4500-81d4-4dc041b88394-kube-api-access-nzpx5\") pod \"627f3bc8-cc79-4500-81d4-4dc041b88394\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.183890 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-custom-prometheus-ca\") pod \"627f3bc8-cc79-4500-81d4-4dc041b88394\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.183951 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627f3bc8-cc79-4500-81d4-4dc041b88394-logs\") pod \"627f3bc8-cc79-4500-81d4-4dc041b88394\" (UID: \"627f3bc8-cc79-4500-81d4-4dc041b88394\") " Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.183267 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"627f3bc8-cc79-4500-81d4-4dc041b88394","Type":"ContainerDied","Data":"be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4"} Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.184737 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"627f3bc8-cc79-4500-81d4-4dc041b88394","Type":"ContainerDied","Data":"0376846e9068aaeba1b63273b3d13d482a8620b17252f42a5236b30d6b0e0bd3"} Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.184957 4782 scope.go:117] "RemoveContainer" containerID="be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.185006 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627f3bc8-cc79-4500-81d4-4dc041b88394-logs" (OuterVolumeSpecName: "logs") pod "627f3bc8-cc79-4500-81d4-4dc041b88394" (UID: "627f3bc8-cc79-4500-81d4-4dc041b88394"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.199544 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627f3bc8-cc79-4500-81d4-4dc041b88394-kube-api-access-nzpx5" (OuterVolumeSpecName: "kube-api-access-nzpx5") pod "627f3bc8-cc79-4500-81d4-4dc041b88394" (UID: "627f3bc8-cc79-4500-81d4-4dc041b88394"). InnerVolumeSpecName "kube-api-access-nzpx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.225444 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "627f3bc8-cc79-4500-81d4-4dc041b88394" (UID: "627f3bc8-cc79-4500-81d4-4dc041b88394"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.271456 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "627f3bc8-cc79-4500-81d4-4dc041b88394" (UID: "627f3bc8-cc79-4500-81d4-4dc041b88394"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.281727 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-config-data" (OuterVolumeSpecName: "config-data") pod "627f3bc8-cc79-4500-81d4-4dc041b88394" (UID: "627f3bc8-cc79-4500-81d4-4dc041b88394"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.285790 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.285819 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.285832 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzpx5\" (UniqueName: \"kubernetes.io/projected/627f3bc8-cc79-4500-81d4-4dc041b88394-kube-api-access-nzpx5\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.285841 4782 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/627f3bc8-cc79-4500-81d4-4dc041b88394-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.285849 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627f3bc8-cc79-4500-81d4-4dc041b88394-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.335755 4782 scope.go:117] "RemoveContainer" containerID="be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4" Jan 30 18:49:37 crc kubenswrapper[4782]: E0130 18:49:37.336195 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4\": container with ID starting with be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4 not found: ID does not exist" containerID="be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.336237 4782 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4"} err="failed to get container status \"be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4\": rpc error: code = NotFound desc = could not find container \"be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4\": container with ID starting with be272e3a5ee276099d00bc2b2fcb9b0e7a95b1ac06db18611b72a196972399f4 not found: ID does not exist" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.517563 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.524275 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.550498 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:49:37 crc kubenswrapper[4782]: E0130 18:49:37.551061 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627f3bc8-cc79-4500-81d4-4dc041b88394" containerName="watcher-decision-engine" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.551089 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="627f3bc8-cc79-4500-81d4-4dc041b88394" containerName="watcher-decision-engine" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.551525 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="627f3bc8-cc79-4500-81d4-4dc041b88394" containerName="watcher-decision-engine" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.552444 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.556404 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.558601 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.694944 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sqh6\" (UniqueName: \"kubernetes.io/projected/e82abe4a-d9ad-47dd-bd5c-2704052ba388-kube-api-access-4sqh6\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.695029 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82abe4a-d9ad-47dd-bd5c-2704052ba388-logs\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.695067 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.695095 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.695145 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.796949 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82abe4a-d9ad-47dd-bd5c-2704052ba388-logs\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.797054 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.797120 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.797219 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.797305 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sqh6\" (UniqueName: \"kubernetes.io/projected/e82abe4a-d9ad-47dd-bd5c-2704052ba388-kube-api-access-4sqh6\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.800327 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82abe4a-d9ad-47dd-bd5c-2704052ba388-logs\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.804882 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-config-data\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.815108 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: 
\"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.824075 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.834647 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sqh6\" (UniqueName: \"kubernetes.io/projected/e82abe4a-d9ad-47dd-bd5c-2704052ba388-kube-api-access-4sqh6\") pod \"watcher-decision-engine-0\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:49:37 crc kubenswrapper[4782]: I0130 18:49:37.886906 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.346218 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:49:38 crc kubenswrapper[4782]: W0130 18:49:38.371376 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode82abe4a_d9ad_47dd_bd5c_2704052ba388.slice/crio-c808a9cd1f3fedf1a620ec21ee0158757cbbc20f2759739edd5b1eb25b581b48 WatchSource:0}: Error finding container c808a9cd1f3fedf1a620ec21ee0158757cbbc20f2759739edd5b1eb25b581b48: Status 404 returned error can't find the container with id c808a9cd1f3fedf1a620ec21ee0158757cbbc20f2759739edd5b1eb25b581b48 Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.434015 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627f3bc8-cc79-4500-81d4-4dc041b88394" path="/var/lib/kubelet/pods/627f3bc8-cc79-4500-81d4-4dc041b88394/volumes" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.546610 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-p9dwp" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.611023 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-combined-ca-bundle\") pod \"c0087b5d-ef94-4433-9e0e-23b509dd3003\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.611104 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl77v\" (UniqueName: \"kubernetes.io/projected/c0087b5d-ef94-4433-9e0e-23b509dd3003-kube-api-access-pl77v\") pod \"c0087b5d-ef94-4433-9e0e-23b509dd3003\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.611155 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-config-data\") pod \"c0087b5d-ef94-4433-9e0e-23b509dd3003\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.611241 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-scripts\") pod \"c0087b5d-ef94-4433-9e0e-23b509dd3003\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.611333 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0087b5d-ef94-4433-9e0e-23b509dd3003-logs\") pod \"c0087b5d-ef94-4433-9e0e-23b509dd3003\" (UID: \"c0087b5d-ef94-4433-9e0e-23b509dd3003\") " Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.612223 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0087b5d-ef94-4433-9e0e-23b509dd3003-logs" (OuterVolumeSpecName: "logs") pod "c0087b5d-ef94-4433-9e0e-23b509dd3003" (UID: "c0087b5d-ef94-4433-9e0e-23b509dd3003"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.615957 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0087b5d-ef94-4433-9e0e-23b509dd3003-kube-api-access-pl77v" (OuterVolumeSpecName: "kube-api-access-pl77v") pod "c0087b5d-ef94-4433-9e0e-23b509dd3003" (UID: "c0087b5d-ef94-4433-9e0e-23b509dd3003"). InnerVolumeSpecName "kube-api-access-pl77v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.622379 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-scripts" (OuterVolumeSpecName: "scripts") pod "c0087b5d-ef94-4433-9e0e-23b509dd3003" (UID: "c0087b5d-ef94-4433-9e0e-23b509dd3003"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.636486 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-config-data" (OuterVolumeSpecName: "config-data") pod "c0087b5d-ef94-4433-9e0e-23b509dd3003" (UID: "c0087b5d-ef94-4433-9e0e-23b509dd3003"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.637026 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0087b5d-ef94-4433-9e0e-23b509dd3003" (UID: "c0087b5d-ef94-4433-9e0e-23b509dd3003"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.713889 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.713927 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0087b5d-ef94-4433-9e0e-23b509dd3003-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.713940 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.713954 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl77v\" (UniqueName: \"kubernetes.io/projected/c0087b5d-ef94-4433-9e0e-23b509dd3003-kube-api-access-pl77v\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:38 crc kubenswrapper[4782]: I0130 18:49:38.713969 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0087b5d-ef94-4433-9e0e-23b509dd3003-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.203611 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-p9dwp" event={"ID":"c0087b5d-ef94-4433-9e0e-23b509dd3003","Type":"ContainerDied","Data":"90c7ce74ee5af845fea5c9b9b2ea323fef7eef6e7c0b39cb26ec3e381c7a71f5"} Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.203647 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c7ce74ee5af845fea5c9b9b2ea323fef7eef6e7c0b39cb26ec3e381c7a71f5" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.203709 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-p9dwp" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.210155 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerStarted","Data":"6deb95f164b18c87f9a7eba4290f9853902b0390e59e003df926b6b9aa3015c8"} Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.210290 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerStarted","Data":"c808a9cd1f3fedf1a620ec21ee0158757cbbc20f2759739edd5b1eb25b581b48"} Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.250482 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.250463314 podStartE2EDuration="2.250463314s" podCreationTimestamp="2026-01-30 18:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:39.243540973 +0000 UTC m=+1155.511918998" watchObservedRunningTime="2026-01-30 18:49:39.250463314 +0000 UTC m=+1155.518841339" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.387957 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6889757c94-v7jr9"] Jan 30 18:49:39 crc kubenswrapper[4782]: E0130 18:49:39.388553 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0087b5d-ef94-4433-9e0e-23b509dd3003" containerName="placement-db-sync" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.388573 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0087b5d-ef94-4433-9e0e-23b509dd3003" containerName="placement-db-sync" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.389092 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0087b5d-ef94-4433-9e0e-23b509dd3003" containerName="placement-db-sync" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.390672 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.394025 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.394417 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zdhpz" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.395093 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.395356 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.395569 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.421117 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6889757c94-v7jr9"] Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.528060 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8sj6\" (UniqueName: \"kubernetes.io/projected/da3c4b41-c384-4983-a704-e63d44f1fed9-kube-api-access-r8sj6\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.528121 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-internal-tls-certs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.528146 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-public-tls-certs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.528775 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-combined-ca-bundle\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.528817 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-scripts\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.528885 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da3c4b41-c384-4983-a704-e63d44f1fed9-logs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.528919 
4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-config-data\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.596291 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631347 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8sj6\" (UniqueName: \"kubernetes.io/projected/da3c4b41-c384-4983-a704-e63d44f1fed9-kube-api-access-r8sj6\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631413 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-internal-tls-certs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631444 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-public-tls-certs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631471 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-combined-ca-bundle\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631493 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-scripts\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631536 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da3c4b41-c384-4983-a704-e63d44f1fed9-logs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631562 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-config-data\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.631983 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da3c4b41-c384-4983-a704-e63d44f1fed9-logs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc 
kubenswrapper[4782]: I0130 18:49:39.635970 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-scripts\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.636042 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-public-tls-certs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.636710 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-combined-ca-bundle\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.639628 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-internal-tls-certs\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.649619 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3c4b41-c384-4983-a704-e63d44f1fed9-config-data\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.649960 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8sj6\" (UniqueName: \"kubernetes.io/projected/da3c4b41-c384-4983-a704-e63d44f1fed9-kube-api-access-r8sj6\") pod \"placement-6889757c94-v7jr9\" (UID: \"da3c4b41-c384-4983-a704-e63d44f1fed9\") " pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.730840 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:39 crc kubenswrapper[4782]: I0130 18:49:39.853995 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Jan 30 18:49:41 crc kubenswrapper[4782]: E0130 18:49:41.782948 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:41 crc kubenswrapper[4782]: E0130 18:49:41.785501 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:41 crc kubenswrapper[4782]: E0130 18:49:41.787545 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:41 crc kubenswrapper[4782]: E0130 18:49:41.787607 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:41 crc kubenswrapper[4782]: I0130 18:49:41.893197 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:49:42 crc kubenswrapper[4782]: I0130 18:49:42.137605 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:49:42 crc kubenswrapper[4782]: I0130 18:49:42.243668 4782 generic.go:334] "Generic (PLEG): container finished" podID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerID="6deb95f164b18c87f9a7eba4290f9853902b0390e59e003df926b6b9aa3015c8" exitCode=1 Jan 30 18:49:42 crc kubenswrapper[4782]: I0130 18:49:42.243743 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerDied","Data":"6deb95f164b18c87f9a7eba4290f9853902b0390e59e003df926b6b9aa3015c8"} Jan 30 18:49:42 crc kubenswrapper[4782]: I0130 18:49:42.244612 4782 scope.go:117] "RemoveContainer" containerID="6deb95f164b18c87f9a7eba4290f9853902b0390e59e003df926b6b9aa3015c8" Jan 30 18:49:43 crc kubenswrapper[4782]: I0130 18:49:43.254085 4782 generic.go:334] "Generic (PLEG): container finished" podID="47aa0756-718b-4d1c-bef5-318895ee6c90" containerID="e0d6b134827799f724f3cf412addb6b1b4a4ef4863d14aad089a1325736c2ff4" exitCode=0 Jan 30 18:49:43 crc kubenswrapper[4782]: I0130 18:49:43.254187 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f7fq6" event={"ID":"47aa0756-718b-4d1c-bef5-318895ee6c90","Type":"ContainerDied","Data":"e0d6b134827799f724f3cf412addb6b1b4a4ef4863d14aad089a1325736c2ff4"} Jan 30 18:49:43 crc kubenswrapper[4782]: I0130 18:49:43.653702 4782 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:49:43 crc kubenswrapper[4782]: I0130 18:49:43.794739 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d67b5c94d-pwj69" Jan 30 18:49:43 crc kubenswrapper[4782]: I0130 18:49:43.861771 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b5464cf9b-2tbsc"] Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.278054 4782 generic.go:334] "Generic (PLEG): container finished" podID="63ffaa09-371e-4549-a56f-11d6734ff40e" containerID="ac7b4bd8674c634ebaaebc3c3cacd267e39e6541b520fc04a9b6032720f3ce1b" exitCode=0 Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.278531 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q8pml" event={"ID":"63ffaa09-371e-4549-a56f-11d6734ff40e","Type":"ContainerDied","Data":"ac7b4bd8674c634ebaaebc3c3cacd267e39e6541b520fc04a9b6032720f3ce1b"} Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.279403 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b5464cf9b-2tbsc" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon-log" containerID="cri-o://441cb98b7bdcb9129ff7e175460fd76e74ce990bde17ddae86c8e1c5a95f2c84" gracePeriod=30 Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.279599 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b5464cf9b-2tbsc" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" containerID="cri-o://721c73bb806b44f16ca65b0e3d3bdf2b72298ad7af4c050b8fb757c30b1fb414" gracePeriod=30 Jan 30 18:49:44 crc kubenswrapper[4782]: E0130 18:49:44.348200 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.539960 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6889757c94-v7jr9"] Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.589994 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.746543 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqb9b\" (UniqueName: \"kubernetes.io/projected/47aa0756-718b-4d1c-bef5-318895ee6c90-kube-api-access-zqb9b\") pod \"47aa0756-718b-4d1c-bef5-318895ee6c90\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.746604 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-db-sync-config-data\") pod \"47aa0756-718b-4d1c-bef5-318895ee6c90\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.746672 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-combined-ca-bundle\") pod \"47aa0756-718b-4d1c-bef5-318895ee6c90\" (UID: \"47aa0756-718b-4d1c-bef5-318895ee6c90\") " Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.750159 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47aa0756-718b-4d1c-bef5-318895ee6c90-kube-api-access-zqb9b" (OuterVolumeSpecName: "kube-api-access-zqb9b") pod "47aa0756-718b-4d1c-bef5-318895ee6c90" (UID: "47aa0756-718b-4d1c-bef5-318895ee6c90"). InnerVolumeSpecName "kube-api-access-zqb9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.754398 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "47aa0756-718b-4d1c-bef5-318895ee6c90" (UID: "47aa0756-718b-4d1c-bef5-318895ee6c90"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.777335 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47aa0756-718b-4d1c-bef5-318895ee6c90" (UID: "47aa0756-718b-4d1c-bef5-318895ee6c90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.849583 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqb9b\" (UniqueName: \"kubernetes.io/projected/47aa0756-718b-4d1c-bef5-318895ee6c90-kube-api-access-zqb9b\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.849611 4782 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.849620 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47aa0756-718b-4d1c-bef5-318895ee6c90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.854693 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Jan 30 18:49:44 crc kubenswrapper[4782]: I0130 18:49:44.873539 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.290603 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6889757c94-v7jr9" event={"ID":"da3c4b41-c384-4983-a704-e63d44f1fed9","Type":"ContainerStarted","Data":"38770adaa97c2c202e8a04fba7c05e4f74cb032d25e8c847594c3c5f62e01312"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.292218 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6889757c94-v7jr9" event={"ID":"da3c4b41-c384-4983-a704-e63d44f1fed9","Type":"ContainerStarted","Data":"ecfca328ec9dd6289ac63bd28f5c22cdfc62af909130c9a97e99cf48ed5144fa"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.292384 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.292409 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.292426 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6889757c94-v7jr9" event={"ID":"da3c4b41-c384-4983-a704-e63d44f1fed9","Type":"ContainerStarted","Data":"373963560b915997c2c957da347ac41f14b1718a60c9b9f1c9477e125dd77064"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.295395 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerStarted","Data":"36ffd1f00aaad9e52df319c9e4a0988eb54569878c08d7921351da4fbb56face"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.295651 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="ceilometer-notification-agent" containerID="cri-o://998a8f9c6850a8020aeea9b233da454493113420844d9bbcb36d0ba8317f16f3" gracePeriod=30 Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.296101 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.296188 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" 
containerName="proxy-httpd" containerID="cri-o://36ffd1f00aaad9e52df319c9e4a0988eb54569878c08d7921351da4fbb56face" gracePeriod=30 Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.296304 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="sg-core" containerID="cri-o://afe1cbacb9d3cf97a4983e60b1791188f97e69195df5975b0ded4b56cd6e4801" gracePeriod=30 Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.300279 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerStarted","Data":"36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.304790 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-f7fq6" event={"ID":"47aa0756-718b-4d1c-bef5-318895ee6c90","Type":"ContainerDied","Data":"5ce6a51b97e0f301f087c16079800082fc48fc8ecc3725fe5dc7655905990dd0"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.304832 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ce6a51b97e0f301f087c16079800082fc48fc8ecc3725fe5dc7655905990dd0" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.304876 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-f7fq6" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.306420 4782 generic.go:334] "Generic (PLEG): container finished" podID="a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" containerID="3a215ed02134533ced39cc9b6b783ff46122b85767c17dfaf3a6c8155209cfa7" exitCode=0 Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.306471 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mpvf8" event={"ID":"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e","Type":"ContainerDied","Data":"3a215ed02134533ced39cc9b6b783ff46122b85767c17dfaf3a6c8155209cfa7"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.308024 4782 generic.go:334] "Generic (PLEG): container finished" podID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerID="721c73bb806b44f16ca65b0e3d3bdf2b72298ad7af4c050b8fb757c30b1fb414" exitCode=0 Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.308769 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5464cf9b-2tbsc" event={"ID":"b2e303df-cb69-4f2f-909f-a1651d376adc","Type":"ContainerDied","Data":"721c73bb806b44f16ca65b0e3d3bdf2b72298ad7af4c050b8fb757c30b1fb414"} Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.316613 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6889757c94-v7jr9" podStartSLOduration=6.316593216 podStartE2EDuration="6.316593216s" podCreationTimestamp="2026-01-30 18:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:45.314871263 +0000 UTC m=+1161.583249288" watchObservedRunningTime="2026-01-30 18:49:45.316593216 +0000 UTC m=+1161.584971231" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.327673 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.563284 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-d84b8b585-bfbrv"] Jan 30 18:49:45 crc kubenswrapper[4782]: 
E0130 18:49:45.563698 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47aa0756-718b-4d1c-bef5-318895ee6c90" containerName="barbican-db-sync" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.563713 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="47aa0756-718b-4d1c-bef5-318895ee6c90" containerName="barbican-db-sync" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.563904 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="47aa0756-718b-4d1c-bef5-318895ee6c90" containerName="barbican-db-sync" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.564853 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.566428 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69354d0f-b465-419f-8fd1-b812a39312c5-logs\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.566480 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-config-data-custom\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.566500 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-combined-ca-bundle\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.566553 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-config-data\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.566682 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxbc\" (UniqueName: \"kubernetes.io/projected/69354d0f-b465-419f-8fd1-b812a39312c5-kube-api-access-kdxbc\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.567753 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t8tvj" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.567964 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.574160 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.581254 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-d6457597d-9bs7l"] Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 
18:49:45.582716 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.588511 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.612874 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d84b8b585-bfbrv"] Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.630341 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d6457597d-9bs7l"] Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667402 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-config-data-custom\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667435 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-combined-ca-bundle\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667483 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-combined-ca-bundle\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667516 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-config-data\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667539 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2cn7\" (UniqueName: \"kubernetes.io/projected/0575e76f-c529-41f7-8b65-87ec77ec9614-kube-api-access-b2cn7\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667565 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxbc\" (UniqueName: \"kubernetes.io/projected/69354d0f-b465-419f-8fd1-b812a39312c5-kube-api-access-kdxbc\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667582 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-config-data-custom\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " 
pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667610 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0575e76f-c529-41f7-8b65-87ec77ec9614-logs\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667628 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-config-data\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.667670 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69354d0f-b465-419f-8fd1-b812a39312c5-logs\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.668074 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69354d0f-b465-419f-8fd1-b812a39312c5-logs\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.679287 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-config-data-custom\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.679623 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69dbb6c5cc-wjvh4"] Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.682039 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69dbb6c5cc-wjvh4"] Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.682136 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.683715 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-config-data\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.696354 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69354d0f-b465-419f-8fd1-b812a39312c5-combined-ca-bundle\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.714851 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxbc\" (UniqueName: \"kubernetes.io/projected/69354d0f-b465-419f-8fd1-b812a39312c5-kube-api-access-kdxbc\") pod \"barbican-worker-d84b8b585-bfbrv\" (UID: \"69354d0f-b465-419f-8fd1-b812a39312c5\") " pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779483 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-swift-storage-0\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779525 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-svc\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779582 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-combined-ca-bundle\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779604 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-config\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779624 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwldw\" (UniqueName: \"kubernetes.io/projected/fff6e7d0-306c-4610-9754-43e0816a66fc-kube-api-access-vwldw\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779672 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2cn7\" (UniqueName: 
\"kubernetes.io/projected/0575e76f-c529-41f7-8b65-87ec77ec9614-kube-api-access-b2cn7\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779707 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-config-data-custom\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779727 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-nb\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779751 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-sb\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779778 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0575e76f-c529-41f7-8b65-87ec77ec9614-logs\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.779797 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-config-data\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.785177 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-config-data\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.786543 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0575e76f-c529-41f7-8b65-87ec77ec9614-logs\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.788156 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-combined-ca-bundle\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 
18:49:45.791046 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-cdd7f6f98-s4js9"] Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.792488 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.794199 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.796162 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q8pml" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.796777 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0575e76f-c529-41f7-8b65-87ec77ec9614-config-data-custom\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.811693 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cdd7f6f98-s4js9"] Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.811748 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2cn7\" (UniqueName: \"kubernetes.io/projected/0575e76f-c529-41f7-8b65-87ec77ec9614-kube-api-access-b2cn7\") pod \"barbican-keystone-listener-d6457597d-9bs7l\" (UID: \"0575e76f-c529-41f7-8b65-87ec77ec9614\") " pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.881287 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzzts\" (UniqueName: \"kubernetes.io/projected/63ffaa09-371e-4549-a56f-11d6734ff40e-kube-api-access-mzzts\") pod \"63ffaa09-371e-4549-a56f-11d6734ff40e\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.881457 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-config\") pod \"63ffaa09-371e-4549-a56f-11d6734ff40e\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.881511 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-combined-ca-bundle\") pod \"63ffaa09-371e-4549-a56f-11d6734ff40e\" (UID: \"63ffaa09-371e-4549-a56f-11d6734ff40e\") " Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.881712 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-nb\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.881755 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 
18:49:45.881781 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-sb\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.881860 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data-custom\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.881922 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b1ca41-cb51-47f5-a52e-fcfba8424503-logs\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.885070 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ffaa09-371e-4549-a56f-11d6734ff40e-kube-api-access-mzzts" (OuterVolumeSpecName: "kube-api-access-mzzts") pod "63ffaa09-371e-4549-a56f-11d6734ff40e" (UID: "63ffaa09-371e-4549-a56f-11d6734ff40e"). InnerVolumeSpecName "kube-api-access-mzzts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.886453 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-nb\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.887728 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-sb\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.889334 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6272\" (UniqueName: \"kubernetes.io/projected/96b1ca41-cb51-47f5-a52e-fcfba8424503-kube-api-access-v6272\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.889430 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-swift-storage-0\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.889481 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-svc\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " 
pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.889562 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-combined-ca-bundle\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.889637 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-config\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.889678 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwldw\" (UniqueName: \"kubernetes.io/projected/fff6e7d0-306c-4610-9754-43e0816a66fc-kube-api-access-vwldw\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.889818 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzzts\" (UniqueName: \"kubernetes.io/projected/63ffaa09-371e-4549-a56f-11d6734ff40e-kube-api-access-mzzts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.890518 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-svc\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.891002 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-config\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.891294 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-swift-storage-0\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.901376 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d84b8b585-bfbrv" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.909049 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.910351 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwldw\" (UniqueName: \"kubernetes.io/projected/fff6e7d0-306c-4610-9754-43e0816a66fc-kube-api-access-vwldw\") pod \"dnsmasq-dns-69dbb6c5cc-wjvh4\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.920464 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-config" (OuterVolumeSpecName: "config") pod "63ffaa09-371e-4549-a56f-11d6734ff40e" (UID: "63ffaa09-371e-4549-a56f-11d6734ff40e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.931731 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63ffaa09-371e-4549-a56f-11d6734ff40e" (UID: "63ffaa09-371e-4549-a56f-11d6734ff40e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.994451 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-combined-ca-bundle\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.994573 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.994609 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data-custom\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.994644 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b1ca41-cb51-47f5-a52e-fcfba8424503-logs\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.994695 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6272\" (UniqueName: \"kubernetes.io/projected/96b1ca41-cb51-47f5-a52e-fcfba8424503-kube-api-access-v6272\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.994781 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:45 crc 
kubenswrapper[4782]: I0130 18:49:45.994793 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ffaa09-371e-4549-a56f-11d6734ff40e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:45 crc kubenswrapper[4782]: I0130 18:49:45.998710 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b1ca41-cb51-47f5-a52e-fcfba8424503-logs\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.008401 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.018731 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6272\" (UniqueName: \"kubernetes.io/projected/96b1ca41-cb51-47f5-a52e-fcfba8424503-kube-api-access-v6272\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.022902 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-combined-ca-bundle\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.023474 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data-custom\") pod \"barbican-api-cdd7f6f98-s4js9\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.049511 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.112668 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.331261 4782 generic.go:334] "Generic (PLEG): container finished" podID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerID="36ffd1f00aaad9e52df319c9e4a0988eb54569878c08d7921351da4fbb56face" exitCode=0 Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.331293 4782 generic.go:334] "Generic (PLEG): container finished" podID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerID="afe1cbacb9d3cf97a4983e60b1791188f97e69195df5975b0ded4b56cd6e4801" exitCode=2 Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.331329 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerDied","Data":"36ffd1f00aaad9e52df319c9e4a0988eb54569878c08d7921351da4fbb56face"} Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.331358 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerDied","Data":"afe1cbacb9d3cf97a4983e60b1791188f97e69195df5975b0ded4b56cd6e4801"} Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.335260 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q8pml" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.337241 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q8pml" event={"ID":"63ffaa09-371e-4549-a56f-11d6734ff40e","Type":"ContainerDied","Data":"37b85200834c2d2a44022730c59b5114c124ba6fdcd976c7c8eb11758be77057"} Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.337281 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b85200834c2d2a44022730c59b5114c124ba6fdcd976c7c8eb11758be77057" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.512821 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69dbb6c5cc-wjvh4"] Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.542409 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74d97c6f95-cktt6"] Jan 30 18:49:46 crc kubenswrapper[4782]: E0130 18:49:46.542850 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ffaa09-371e-4549-a56f-11d6734ff40e" containerName="neutron-db-sync" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.542863 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ffaa09-371e-4549-a56f-11d6734ff40e" containerName="neutron-db-sync" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.543072 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ffaa09-371e-4549-a56f-11d6734ff40e" containerName="neutron-db-sync" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.544030 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.563729 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d97c6f95-cktt6"] Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.626202 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-64684dfb44-vvmcx"] Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.627709 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.632450 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64684dfb44-vvmcx"] Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.634835 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.635020 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.635220 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.635714 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-svc\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.636387 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hs2p\" (UniqueName: \"kubernetes.io/projected/5ba68576-0707-479a-9c66-5731c32c9085-kube-api-access-5hs2p\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.636512 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-sb\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.636649 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-nb\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.636765 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-config\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.636922 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-swift-storage-0\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.641072 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nggnw" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.699308 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d84b8b585-bfbrv"] Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 
18:49:46.738597 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-sb\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738664 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-nb\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738695 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-config\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738736 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-ovndb-tls-certs\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738766 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-httpd-config\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738790 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-swift-storage-0\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738813 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-config\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738830 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-svc\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738875 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-combined-ca-bundle\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738895 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzzh\" (UniqueName: \"kubernetes.io/projected/076239f6-df06-4f9a-a0e6-70413767a0c9-kube-api-access-jvzzh\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.738924 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hs2p\" (UniqueName: \"kubernetes.io/projected/5ba68576-0707-479a-9c66-5731c32c9085-kube-api-access-5hs2p\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.739505 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-sb\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.739869 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-nb\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.740099 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-swift-storage-0\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.740410 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-svc\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.741008 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-config\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.758688 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hs2p\" (UniqueName: \"kubernetes.io/projected/5ba68576-0707-479a-9c66-5731c32c9085-kube-api-access-5hs2p\") pod \"dnsmasq-dns-74d97c6f95-cktt6\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: E0130 18:49:46.802601 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:46 crc kubenswrapper[4782]: E0130 18:49:46.805398 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:46 crc kubenswrapper[4782]: E0130 18:49:46.809757 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:46 crc kubenswrapper[4782]: E0130 18:49:46.809797 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.840459 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-ovndb-tls-certs\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.840516 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-httpd-config\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.840550 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-config\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.840599 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-combined-ca-bundle\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.840629 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzzh\" (UniqueName: \"kubernetes.io/projected/076239f6-df06-4f9a-a0e6-70413767a0c9-kube-api-access-jvzzh\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.845026 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-httpd-config\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.845103 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-config\") pod \"neutron-64684dfb44-vvmcx\" (UID: 
\"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.845610 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-ovndb-tls-certs\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.861770 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzzh\" (UniqueName: \"kubernetes.io/projected/076239f6-df06-4f9a-a0e6-70413767a0c9-kube-api-access-jvzzh\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.866176 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-combined-ca-bundle\") pod \"neutron-64684dfb44-vvmcx\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.869928 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:46 crc kubenswrapper[4782]: I0130 18:49:46.948386 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.066627 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cdd7f6f98-s4js9"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.090924 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d6457597d-9bs7l"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.105554 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69dbb6c5cc-wjvh4"] Jan 30 18:49:47 crc kubenswrapper[4782]: W0130 18:49:47.109357 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfff6e7d0_306c_4610_9754_43e0816a66fc.slice/crio-0d6d71128527e1fd2b3a80e884f057a4e87f7649ffd44515ffe4064f18f47e69 WatchSource:0}: Error finding container 0d6d71128527e1fd2b3a80e884f057a4e87f7649ffd44515ffe4064f18f47e69: Status 404 returned error can't find the container with id 0d6d71128527e1fd2b3a80e884f057a4e87f7649ffd44515ffe4064f18f47e69 Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.227807 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.360902 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vxdc\" (UniqueName: \"kubernetes.io/projected/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-kube-api-access-9vxdc\") pod \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.361043 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-db-sync-config-data\") pod \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.361080 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-etc-machine-id\") pod \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.361150 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-config-data\") pod \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.361221 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-scripts\") pod \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.361257 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-combined-ca-bundle\") pod \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\" (UID: \"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e\") " Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.361421 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" (UID: "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.361662 4782 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.369944 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" (UID: "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.370164 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-kube-api-access-9vxdc" (OuterVolumeSpecName: "kube-api-access-9vxdc") pod "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" (UID: "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e"). InnerVolumeSpecName "kube-api-access-9vxdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.370340 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" event={"ID":"fff6e7d0-306c-4610-9754-43e0816a66fc","Type":"ContainerStarted","Data":"0d6d71128527e1fd2b3a80e884f057a4e87f7649ffd44515ffe4064f18f47e69"} Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.373544 4782 generic.go:334] "Generic (PLEG): container finished" podID="07adaf47-0b0c-46f9-bf42-fc02ffec87a4" containerID="3cb806c040408344fb39ee4e0454be212dc4c5bafa67da379f195fafca08c84b" exitCode=0 Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.373910 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bpwp5" event={"ID":"07adaf47-0b0c-46f9-bf42-fc02ffec87a4","Type":"ContainerDied","Data":"3cb806c040408344fb39ee4e0454be212dc4c5bafa67da379f195fafca08c84b"} Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.387016 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-scripts" (OuterVolumeSpecName: "scripts") pod "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" (UID: "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.388890 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mpvf8" event={"ID":"a9ddf9ab-a21e-4d41-b795-c0c926e38a1e","Type":"ContainerDied","Data":"633e28b09f8def8c1d3f193b0fe796f80c1a4117196811f3c8fa6c1980637517"} Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.389015 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="633e28b09f8def8c1d3f193b0fe796f80c1a4117196811f3c8fa6c1980637517" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.389163 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mpvf8" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.403421 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d84b8b585-bfbrv" event={"ID":"69354d0f-b465-419f-8fd1-b812a39312c5","Type":"ContainerStarted","Data":"5975b2fe42fda876a54877a340a54920b57592cb73225b2f8de0bb825ac85af5"} Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.412298 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cdd7f6f98-s4js9" event={"ID":"96b1ca41-cb51-47f5-a52e-fcfba8424503","Type":"ContainerStarted","Data":"7e7ccfce383e337bbdf37f09639d15a161917f5de0df8d5b39ecd25b84950d11"} Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.422281 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" event={"ID":"0575e76f-c529-41f7-8b65-87ec77ec9614","Type":"ContainerStarted","Data":"ab3f4f080169d7f26ee5217506c72fd84d24a1e15978e1851219b9dae8260a94"} Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.463912 4782 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.464131 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.464194 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vxdc\" (UniqueName: \"kubernetes.io/projected/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-kube-api-access-9vxdc\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.470245 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" (UID: "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.473784 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74d97c6f95-cktt6"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.529117 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:49:47 crc kubenswrapper[4782]: E0130 18:49:47.529536 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" containerName="cinder-db-sync" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.529551 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" containerName="cinder-db-sync" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.531701 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" containerName="cinder-db-sync" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.532673 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.536792 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.540331 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-config-data" (OuterVolumeSpecName: "config-data") pod "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" (UID: "a9ddf9ab-a21e-4d41-b795-c0c926e38a1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.566517 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.566547 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.579344 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.629093 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74d97c6f95-cktt6"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.664821 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-556f5c66f5-rp6pr"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.667696 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.667758 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.667844 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zkvq\" (UniqueName: \"kubernetes.io/projected/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-kube-api-access-8zkvq\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.667883 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.667922 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.667971 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.672922 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.674246 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-556f5c66f5-rp6pr"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770017 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-nb\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770061 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770107 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvpf2\" (UniqueName: \"kubernetes.io/projected/303f380e-9ed9-4b93-a654-6bff8df34a6d-kube-api-access-tvpf2\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770135 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770157 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-swift-storage-0\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770181 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-config\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770204 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 
30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770241 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770311 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zkvq\" (UniqueName: \"kubernetes.io/projected/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-kube-api-access-8zkvq\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770331 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-sb\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770355 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-svc\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.770372 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.773394 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.777648 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.781489 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.782145 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.789986 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.796479 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zkvq\" (UniqueName: \"kubernetes.io/projected/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-kube-api-access-8zkvq\") pod \"cinder-scheduler-0\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.840958 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.842910 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.852162 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.852511 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.862600 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.872497 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-swift-storage-0\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.872580 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-config\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.872733 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-sb\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.872790 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-svc\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.874351 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-nb\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.874446 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvpf2\" (UniqueName: 
\"kubernetes.io/projected/303f380e-9ed9-4b93-a654-6bff8df34a6d-kube-api-access-tvpf2\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.875711 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-svc\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.875768 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-sb\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.876546 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-config\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.876728 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-nb\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.878465 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-swift-storage-0\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.887818 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.892441 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvpf2\" (UniqueName: \"kubernetes.io/projected/303f380e-9ed9-4b93-a654-6bff8df34a6d-kube-api-access-tvpf2\") pod \"dnsmasq-dns-556f5c66f5-rp6pr\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:47 crc kubenswrapper[4782]: I0130 18:49:47.929879 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.005987 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.005222 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-scripts\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.008066 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.008114 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data-custom\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.008180 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvgmx\" (UniqueName: \"kubernetes.io/projected/20f1886b-941e-48f2-baf7-0116fe6b689e-kube-api-access-zvgmx\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.020600 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20f1886b-941e-48f2-baf7-0116fe6b689e-logs\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.020684 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20f1886b-941e-48f2-baf7-0116fe6b689e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.020738 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.121770 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122021 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data-custom\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122041 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zvgmx\" (UniqueName: \"kubernetes.io/projected/20f1886b-941e-48f2-baf7-0116fe6b689e-kube-api-access-zvgmx\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122086 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20f1886b-941e-48f2-baf7-0116fe6b689e-logs\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122127 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20f1886b-941e-48f2-baf7-0116fe6b689e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122160 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122214 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-scripts\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122456 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20f1886b-941e-48f2-baf7-0116fe6b689e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.122791 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20f1886b-941e-48f2-baf7-0116fe6b689e-logs\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.127487 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data-custom\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.127915 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64684dfb44-vvmcx"] Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.128693 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.130714 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: 
I0130 18:49:48.135403 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-scripts\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.140080 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvgmx\" (UniqueName: \"kubernetes.io/projected/20f1886b-941e-48f2-baf7-0116fe6b689e-kube-api-access-zvgmx\") pod \"cinder-api-0\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.173299 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.433089 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cdd7f6f98-s4js9" event={"ID":"96b1ca41-cb51-47f5-a52e-fcfba8424503","Type":"ContainerStarted","Data":"64951b851656ef91c290bdff5f1f904ce39b1113ebe95dc7d8f6c075b5322eaa"} Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.435451 4782 generic.go:334] "Generic (PLEG): container finished" podID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerID="36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3" exitCode=1 Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.435600 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerDied","Data":"36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3"} Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.435704 4782 scope.go:117] "RemoveContainer" containerID="6deb95f164b18c87f9a7eba4290f9853902b0390e59e003df926b6b9aa3015c8" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.436175 4782 scope.go:117] "RemoveContainer" containerID="36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3" Jan 30 18:49:48 crc kubenswrapper[4782]: E0130 18:49:48.436476 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(e82abe4a-d9ad-47dd-bd5c-2704052ba388)\"" pod="openstack/watcher-decision-engine-0" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.437645 4782 generic.go:334] "Generic (PLEG): container finished" podID="fff6e7d0-306c-4610-9754-43e0816a66fc" containerID="15384a0260e77d39c351d2a3cc58f3ee976bf652b0b1367114e391d73fa339e8" exitCode=0 Jan 30 18:49:48 crc kubenswrapper[4782]: I0130 18:49:48.437861 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" event={"ID":"fff6e7d0-306c-4610-9754-43e0816a66fc","Type":"ContainerDied","Data":"15384a0260e77d39c351d2a3cc58f3ee976bf652b0b1367114e391d73fa339e8"} Jan 30 18:49:49 crc kubenswrapper[4782]: W0130 18:49:49.380735 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ba68576_0707_479a_9c66_5731c32c9085.slice/crio-4b22bcbd9122863f520ffff7128c5dfdb671dcc70850a50020484019ac28c720 WatchSource:0}: Error finding container 4b22bcbd9122863f520ffff7128c5dfdb671dcc70850a50020484019ac28c720: Status 404 returned error can't find the container with id 
4b22bcbd9122863f520ffff7128c5dfdb671dcc70850a50020484019ac28c720 Jan 30 18:49:49 crc kubenswrapper[4782]: I0130 18:49:49.447910 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" event={"ID":"5ba68576-0707-479a-9c66-5731c32c9085","Type":"ContainerStarted","Data":"4b22bcbd9122863f520ffff7128c5dfdb671dcc70850a50020484019ac28c720"} Jan 30 18:49:49 crc kubenswrapper[4782]: I0130 18:49:49.451058 4782 generic.go:334] "Generic (PLEG): container finished" podID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerID="998a8f9c6850a8020aeea9b233da454493113420844d9bbcb36d0ba8317f16f3" exitCode=0 Jan 30 18:49:49 crc kubenswrapper[4782]: I0130 18:49:49.451126 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerDied","Data":"998a8f9c6850a8020aeea9b233da454493113420844d9bbcb36d0ba8317f16f3"} Jan 30 18:49:49 crc kubenswrapper[4782]: I0130 18:49:49.452743 4782 scope.go:117] "RemoveContainer" containerID="36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3" Jan 30 18:49:49 crc kubenswrapper[4782]: E0130 18:49:49.453271 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(e82abe4a-d9ad-47dd-bd5c-2704052ba388)\"" pod="openstack/watcher-decision-engine-0" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" Jan 30 18:49:49 crc kubenswrapper[4782]: W0130 18:49:49.667570 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod076239f6_df06_4f9a_a0e6_70413767a0c9.slice/crio-931ab4d9e4325700563ffaf1747d0f28141b450cc79b960e97c218c7471c4634 WatchSource:0}: Error finding container 931ab4d9e4325700563ffaf1747d0f28141b450cc79b960e97c218c7471c4634: Status 404 returned error can't find the container with id 931ab4d9e4325700563ffaf1747d0f28141b450cc79b960e97c218c7471c4634 Jan 30 18:49:49 crc kubenswrapper[4782]: I0130 18:49:49.901689 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bpwp5" Jan 30 18:49:49 crc kubenswrapper[4782]: I0130 18:49:49.931381 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.028588 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072121 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-swift-storage-0\") pod \"fff6e7d0-306c-4610-9754-43e0816a66fc\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072190 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-config\") pod \"fff6e7d0-306c-4610-9754-43e0816a66fc\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072243 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-nb\") pod \"fff6e7d0-306c-4610-9754-43e0816a66fc\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072281 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-combined-ca-bundle\") pod \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072309 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7f88\" (UniqueName: \"kubernetes.io/projected/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-kube-api-access-b7f88\") pod \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072347 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-db-sync-config-data\") pod \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072377 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-svc\") pod \"fff6e7d0-306c-4610-9754-43e0816a66fc\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072400 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-sb\") pod \"fff6e7d0-306c-4610-9754-43e0816a66fc\" (UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072508 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-config-data\") pod \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\" (UID: \"07adaf47-0b0c-46f9-bf42-fc02ffec87a4\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.072553 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwldw\" (UniqueName: \"kubernetes.io/projected/fff6e7d0-306c-4610-9754-43e0816a66fc-kube-api-access-vwldw\") pod \"fff6e7d0-306c-4610-9754-43e0816a66fc\" 
(UID: \"fff6e7d0-306c-4610-9754-43e0816a66fc\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.087884 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff6e7d0-306c-4610-9754-43e0816a66fc-kube-api-access-vwldw" (OuterVolumeSpecName: "kube-api-access-vwldw") pod "fff6e7d0-306c-4610-9754-43e0816a66fc" (UID: "fff6e7d0-306c-4610-9754-43e0816a66fc"). InnerVolumeSpecName "kube-api-access-vwldw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.092442 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "07adaf47-0b0c-46f9-bf42-fc02ffec87a4" (UID: "07adaf47-0b0c-46f9-bf42-fc02ffec87a4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.092893 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-kube-api-access-b7f88" (OuterVolumeSpecName: "kube-api-access-b7f88") pod "07adaf47-0b0c-46f9-bf42-fc02ffec87a4" (UID: "07adaf47-0b0c-46f9-bf42-fc02ffec87a4"). InnerVolumeSpecName "kube-api-access-b7f88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.108119 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fff6e7d0-306c-4610-9754-43e0816a66fc" (UID: "fff6e7d0-306c-4610-9754-43e0816a66fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.110263 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b5464cf9b-2tbsc" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.123331 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fff6e7d0-306c-4610-9754-43e0816a66fc" (UID: "fff6e7d0-306c-4610-9754-43e0816a66fc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.127345 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07adaf47-0b0c-46f9-bf42-fc02ffec87a4" (UID: "07adaf47-0b0c-46f9-bf42-fc02ffec87a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.136096 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fff6e7d0-306c-4610-9754-43e0816a66fc" (UID: "fff6e7d0-306c-4610-9754-43e0816a66fc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.139779 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-config" (OuterVolumeSpecName: "config") pod "fff6e7d0-306c-4610-9754-43e0816a66fc" (UID: "fff6e7d0-306c-4610-9754-43e0816a66fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.164956 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fff6e7d0-306c-4610-9754-43e0816a66fc" (UID: "fff6e7d0-306c-4610-9754-43e0816a66fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.181348 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-run-httpd\") pod \"a13b1ca2-1722-425b-ae48-d34c99f746f6\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.181376 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-combined-ca-bundle\") pod \"a13b1ca2-1722-425b-ae48-d34c99f746f6\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.181444 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-sg-core-conf-yaml\") pod \"a13b1ca2-1722-425b-ae48-d34c99f746f6\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.181489 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-log-httpd\") pod \"a13b1ca2-1722-425b-ae48-d34c99f746f6\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.181509 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-scripts\") pod \"a13b1ca2-1722-425b-ae48-d34c99f746f6\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.181553 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84zqs\" (UniqueName: \"kubernetes.io/projected/a13b1ca2-1722-425b-ae48-d34c99f746f6-kube-api-access-84zqs\") pod \"a13b1ca2-1722-425b-ae48-d34c99f746f6\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.181692 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-config-data\") pod \"a13b1ca2-1722-425b-ae48-d34c99f746f6\" (UID: \"a13b1ca2-1722-425b-ae48-d34c99f746f6\") " Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182042 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7f88\" (UniqueName: 
\"kubernetes.io/projected/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-kube-api-access-b7f88\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182058 4782 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182067 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182075 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182084 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwldw\" (UniqueName: \"kubernetes.io/projected/fff6e7d0-306c-4610-9754-43e0816a66fc-kube-api-access-vwldw\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182093 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182102 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182111 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fff6e7d0-306c-4610-9754-43e0816a66fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182119 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182653 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a13b1ca2-1722-425b-ae48-d34c99f746f6" (UID: "a13b1ca2-1722-425b-ae48-d34c99f746f6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.182816 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a13b1ca2-1722-425b-ae48-d34c99f746f6" (UID: "a13b1ca2-1722-425b-ae48-d34c99f746f6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.187099 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13b1ca2-1722-425b-ae48-d34c99f746f6-kube-api-access-84zqs" (OuterVolumeSpecName: "kube-api-access-84zqs") pod "a13b1ca2-1722-425b-ae48-d34c99f746f6" (UID: "a13b1ca2-1722-425b-ae48-d34c99f746f6"). InnerVolumeSpecName "kube-api-access-84zqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.191328 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-scripts" (OuterVolumeSpecName: "scripts") pod "a13b1ca2-1722-425b-ae48-d34c99f746f6" (UID: "a13b1ca2-1722-425b-ae48-d34c99f746f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.203504 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-config-data" (OuterVolumeSpecName: "config-data") pod "07adaf47-0b0c-46f9-bf42-fc02ffec87a4" (UID: "07adaf47-0b0c-46f9-bf42-fc02ffec87a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.283846 4782 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.283880 4782 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a13b1ca2-1722-425b-ae48-d34c99f746f6-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.283889 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.283897 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07adaf47-0b0c-46f9-bf42-fc02ffec87a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.283905 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84zqs\" (UniqueName: \"kubernetes.io/projected/a13b1ca2-1722-425b-ae48-d34c99f746f6-kube-api-access-84zqs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.381480 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a13b1ca2-1722-425b-ae48-d34c99f746f6" (UID: "a13b1ca2-1722-425b-ae48-d34c99f746f6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.387945 4782 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.403391 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:49:50 crc kubenswrapper[4782]: W0130 18:49:50.424636 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc143d7ec_7d96_4365_8e88_c2f41c55ebf7.slice/crio-2b70e41b8a2308e8f7732cb323228614011db35236c64f6b0353bf1997b71928 WatchSource:0}: Error finding container 2b70e41b8a2308e8f7732cb323228614011db35236c64f6b0353bf1997b71928: Status 404 returned error can't find the container with id 2b70e41b8a2308e8f7732cb323228614011db35236c64f6b0353bf1997b71928 Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.560419 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a13b1ca2-1722-425b-ae48-d34c99f746f6" (UID: "a13b1ca2-1722-425b-ae48-d34c99f746f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.560458 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-config-data" (OuterVolumeSpecName: "config-data") pod "a13b1ca2-1722-425b-ae48-d34c99f746f6" (UID: "a13b1ca2-1722-425b-ae48-d34c99f746f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.562705 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bpwp5" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.593669 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.593759 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a13b1ca2-1722-425b-ae48-d34c99f746f6-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.612869 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" podUID="5ba68576-0707-479a-9c66-5731c32c9085" containerName="init" containerID="cri-o://a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f" gracePeriod=10 Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.712858 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.729741 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787616 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787661 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787674 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bpwp5" event={"ID":"07adaf47-0b0c-46f9-bf42-fc02ffec87a4","Type":"ContainerDied","Data":"11aad5f2e222830f361eb66056d34b6dcbb23a1114d60f97f0eb522e07453b61"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787701 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11aad5f2e222830f361eb66056d34b6dcbb23a1114d60f97f0eb522e07453b61" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787714 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787729 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-556f5c66f5-rp6pr"] Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787824 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" event={"ID":"5ba68576-0707-479a-9c66-5731c32c9085","Type":"ContainerStarted","Data":"a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787854 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a13b1ca2-1722-425b-ae48-d34c99f746f6","Type":"ContainerDied","Data":"11aea3b826fe2c4213c49072055b289505f153e9bc3daf420a1d7c622ec9058b"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787872 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69dbb6c5cc-wjvh4" event={"ID":"fff6e7d0-306c-4610-9754-43e0816a66fc","Type":"ContainerDied","Data":"0d6d71128527e1fd2b3a80e884f057a4e87f7649ffd44515ffe4064f18f47e69"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787885 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64684dfb44-vvmcx" event={"ID":"076239f6-df06-4f9a-a0e6-70413767a0c9","Type":"ContainerStarted","Data":"931ab4d9e4325700563ffaf1747d0f28141b450cc79b960e97c218c7471c4634"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787896 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c143d7ec-7d96-4365-8e88-c2f41c55ebf7","Type":"ContainerStarted","Data":"2b70e41b8a2308e8f7732cb323228614011db35236c64f6b0353bf1997b71928"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787906 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d84b8b585-bfbrv" event={"ID":"69354d0f-b465-419f-8fd1-b812a39312c5","Type":"ContainerStarted","Data":"17b6f3d4c5b6b05c29e8ddd0cae8a46207ef758137e474bccb3dd5d1767b713b"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787927 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cdd7f6f98-s4js9" event={"ID":"96b1ca41-cb51-47f5-a52e-fcfba8424503","Type":"ContainerStarted","Data":"5a0161df6f177b020d0463254982fbe7c36b18fb5c70d3da052e57c85fdae7ed"} Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.787945 4782 scope.go:117] "RemoveContainer" 
containerID="36ffd1f00aaad9e52df319c9e4a0988eb54569878c08d7921351da4fbb56face" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.813423 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-cdd7f6f98-s4js9" podStartSLOduration=5.8133973789999995 podStartE2EDuration="5.813397379s" podCreationTimestamp="2026-01-30 18:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:50.798398758 +0000 UTC m=+1167.066776783" watchObservedRunningTime="2026-01-30 18:49:50.813397379 +0000 UTC m=+1167.081775404" Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.927681 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.965719 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:49:50 crc kubenswrapper[4782]: I0130 18:49:50.984763 4782 scope.go:117] "RemoveContainer" containerID="afe1cbacb9d3cf97a4983e60b1791188f97e69195df5975b0ded4b56cd6e4801" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.010547 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69dbb6c5cc-wjvh4"] Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.022118 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69dbb6c5cc-wjvh4"] Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.036885 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.037408 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="proxy-httpd" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037424 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="proxy-httpd" Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.037435 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="ceilometer-notification-agent" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037441 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="ceilometer-notification-agent" Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.037454 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="sg-core" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037460 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="sg-core" Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.037469 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff6e7d0-306c-4610-9754-43e0816a66fc" containerName="init" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037474 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff6e7d0-306c-4610-9754-43e0816a66fc" containerName="init" Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.037494 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07adaf47-0b0c-46f9-bf42-fc02ffec87a4" containerName="glance-db-sync" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037500 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="07adaf47-0b0c-46f9-bf42-fc02ffec87a4" 
containerName="glance-db-sync" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037712 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="ceilometer-notification-agent" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037730 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="07adaf47-0b0c-46f9-bf42-fc02ffec87a4" containerName="glance-db-sync" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037744 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="proxy-httpd" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037759 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" containerName="sg-core" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.037779 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff6e7d0-306c-4610-9754-43e0816a66fc" containerName="init" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.039615 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.041485 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.050935 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.052020 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.127248 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwm5\" (UniqueName: \"kubernetes.io/projected/b357c566-0063-4e60-b284-9d4e8911734d-kube-api-access-9xwm5\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.127331 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.129363 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-run-httpd\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.129564 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-log-httpd\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.129600 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-scripts\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 
18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.129651 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-config-data\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.129897 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.135479 4782 scope.go:117] "RemoveContainer" containerID="998a8f9c6850a8020aeea9b233da454493113420844d9bbcb36d0ba8317f16f3" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.240440 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwm5\" (UniqueName: \"kubernetes.io/projected/b357c566-0063-4e60-b284-9d4e8911734d-kube-api-access-9xwm5\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.240515 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.240548 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-run-httpd\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.240581 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-log-httpd\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.240600 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-scripts\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.240617 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-config-data\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.240662 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.245565 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-run-httpd\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.247170 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-log-httpd\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.278165 4782 scope.go:117] "RemoveContainer" containerID="15384a0260e77d39c351d2a3cc58f3ee976bf652b0b1367114e391d73fa339e8" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.279902 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.280936 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-scripts\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.283061 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xwm5\" (UniqueName: \"kubernetes.io/projected/b357c566-0063-4e60-b284-9d4e8911734d-kube-api-access-9xwm5\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.309646 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.309949 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-config-data\") pod \"ceilometer-0\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.379651 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-556f5c66f5-rp6pr"] Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.401413 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.424715 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b79bb54f5-2wxwg"] Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.426170 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.443732 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b79bb54f5-2wxwg"] Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.446911 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-config\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.446959 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7cgt\" (UniqueName: \"kubernetes.io/projected/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-kube-api-access-w7cgt\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.447029 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-swift-storage-0\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.447128 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-nb\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.447160 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-svc\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.447315 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-sb\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.549360 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-nb\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.549457 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-svc\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.550420 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-svc\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.550500 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-sb\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.551064 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-config\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.551672 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7cgt\" (UniqueName: \"kubernetes.io/projected/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-kube-api-access-w7cgt\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.551005 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-sb\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.551620 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-config\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.550597 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-nb\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.551774 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-swift-storage-0\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.552835 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-swift-storage-0\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.569759 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7cgt\" (UniqueName: 
\"kubernetes.io/projected/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-kube-api-access-w7cgt\") pod \"dnsmasq-dns-7b79bb54f5-2wxwg\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.770101 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.795304 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.809699 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.847726 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.859567 4782 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 30 18:49:51 crc kubenswrapper[4782]: E0130 18:49:51.859911 4782 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.860479 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-svc\") pod \"5ba68576-0707-479a-9c66-5731c32c9085\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.860532 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-config\") pod \"5ba68576-0707-479a-9c66-5731c32c9085\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.860581 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-swift-storage-0\") pod \"5ba68576-0707-479a-9c66-5731c32c9085\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.860626 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hs2p\" (UniqueName: \"kubernetes.io/projected/5ba68576-0707-479a-9c66-5731c32c9085-kube-api-access-5hs2p\") pod \"5ba68576-0707-479a-9c66-5731c32c9085\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " 
Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.860694 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-sb\") pod \"5ba68576-0707-479a-9c66-5731c32c9085\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.860720 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-nb\") pod \"5ba68576-0707-479a-9c66-5731c32c9085\" (UID: \"5ba68576-0707-479a-9c66-5731c32c9085\") " Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.874984 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" event={"ID":"0575e76f-c529-41f7-8b65-87ec77ec9614","Type":"ContainerStarted","Data":"634cbb1c2eb985b4ce9983b7bb454b91d8478efec20b78c668513153123d7d0e"} Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.887033 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64684dfb44-vvmcx" event={"ID":"076239f6-df06-4f9a-a0e6-70413767a0c9","Type":"ContainerStarted","Data":"a873fad9675ee26d9360b345482b465a6a465f5d77368b767258ac24b5b72bf2"} Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.888081 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" event={"ID":"303f380e-9ed9-4b93-a654-6bff8df34a6d","Type":"ContainerStarted","Data":"0c5db487d3ff84bb4b5a8c500a91b90fd4641f982d34d86066f2af9db5f370bd"} Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.891515 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20f1886b-941e-48f2-baf7-0116fe6b689e","Type":"ContainerStarted","Data":"29dc8d8dfe511aab5198ec37a7f525f9a5d3be9f61db4b7122de4421ec08c8c0"} Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.891915 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba68576-0707-479a-9c66-5731c32c9085-kube-api-access-5hs2p" (OuterVolumeSpecName: "kube-api-access-5hs2p") pod "5ba68576-0707-479a-9c66-5731c32c9085" (UID: "5ba68576-0707-479a-9c66-5731c32c9085"). InnerVolumeSpecName "kube-api-access-5hs2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.920535 4782 generic.go:334] "Generic (PLEG): container finished" podID="5ba68576-0707-479a-9c66-5731c32c9085" containerID="a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f" exitCode=0 Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.920602 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" event={"ID":"5ba68576-0707-479a-9c66-5731c32c9085","Type":"ContainerDied","Data":"a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f"} Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.920630 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" event={"ID":"5ba68576-0707-479a-9c66-5731c32c9085","Type":"ContainerDied","Data":"4b22bcbd9122863f520ffff7128c5dfdb671dcc70850a50020484019ac28c720"} Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.920646 4782 scope.go:117] "RemoveContainer" containerID="a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.920728 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74d97c6f95-cktt6" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.942771 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d84b8b585-bfbrv" event={"ID":"69354d0f-b465-419f-8fd1-b812a39312c5","Type":"ContainerStarted","Data":"2a8c321c5017af5cf01651d935097efd567767bf86afdbd10e6ef023c711dd4b"} Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.947615 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ba68576-0707-479a-9c66-5731c32c9085" (UID: "5ba68576-0707-479a-9c66-5731c32c9085"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.963387 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hs2p\" (UniqueName: \"kubernetes.io/projected/5ba68576-0707-479a-9c66-5731c32c9085-kube-api-access-5hs2p\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.963593 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.967914 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ba68576-0707-479a-9c66-5731c32c9085" (UID: "5ba68576-0707-479a-9c66-5731c32c9085"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.982037 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-d84b8b585-bfbrv" podStartSLOduration=3.8095322080000003 podStartE2EDuration="6.982017376s" podCreationTimestamp="2026-01-30 18:49:45 +0000 UTC" firstStartedPulling="2026-01-30 18:49:46.711439347 +0000 UTC m=+1162.979817372" lastFinishedPulling="2026-01-30 18:49:49.883924505 +0000 UTC m=+1166.152302540" observedRunningTime="2026-01-30 18:49:51.955785657 +0000 UTC m=+1168.224163682" watchObservedRunningTime="2026-01-30 18:49:51.982017376 +0000 UTC m=+1168.250395401" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.982962 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-config" (OuterVolumeSpecName: "config") pod "5ba68576-0707-479a-9c66-5731c32c9085" (UID: "5ba68576-0707-479a-9c66-5731c32c9085"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:51 crc kubenswrapper[4782]: I0130 18:49:51.987969 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5ba68576-0707-479a-9c66-5731c32c9085" (UID: "5ba68576-0707-479a-9c66-5731c32c9085"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.001238 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ba68576-0707-479a-9c66-5731c32c9085" (UID: "5ba68576-0707-479a-9c66-5731c32c9085"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.016263 4782 scope.go:117] "RemoveContainer" containerID="a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f" Jan 30 18:49:52 crc kubenswrapper[4782]: E0130 18:49:52.028654 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f\": container with ID starting with a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f not found: ID does not exist" containerID="a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.028732 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f"} err="failed to get container status \"a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f\": rpc error: code = NotFound desc = could not find container \"a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f\": container with ID starting with a9e5c762e1f7ecb42edc6562cf3978cf99469e17454fda355aea194fa6e4451f not found: ID does not exist" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.065631 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.065657 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.065668 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.065677 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ba68576-0707-479a-9c66-5731c32c9085-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.185432 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:49:52 crc kubenswrapper[4782]: W0130 18:49:52.187323 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb357c566_0063_4e60_b284_9d4e8911734d.slice/crio-728d420c86f814ef192851b23b6445aaa96498e9a053f33a700f6554f613f4d4 WatchSource:0}: Error finding container 728d420c86f814ef192851b23b6445aaa96498e9a053f33a700f6554f613f4d4: Status 404 returned error can't find the container with id 728d420c86f814ef192851b23b6445aaa96498e9a053f33a700f6554f613f4d4 Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.336343 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74d97c6f95-cktt6"] Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.351295 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74d97c6f95-cktt6"] Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.382277 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:49:52 
crc kubenswrapper[4782]: E0130 18:49:52.382679 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba68576-0707-479a-9c66-5731c32c9085" containerName="init" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.382690 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba68576-0707-479a-9c66-5731c32c9085" containerName="init" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.382846 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba68576-0707-479a-9c66-5731c32c9085" containerName="init" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.385079 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.389287 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.389498 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.389675 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rwwlx" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.397432 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.486332 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba68576-0707-479a-9c66-5731c32c9085" path="/var/lib/kubelet/pods/5ba68576-0707-479a-9c66-5731c32c9085/volumes" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.487261 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13b1ca2-1722-425b-ae48-d34c99f746f6" path="/var/lib/kubelet/pods/a13b1ca2-1722-425b-ae48-d34c99f746f6/volumes" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.488054 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff6e7d0-306c-4610-9754-43e0816a66fc" path="/var/lib/kubelet/pods/fff6e7d0-306c-4610-9754-43e0816a66fc/volumes" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.489953 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b79bb54f5-2wxwg"] Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.495712 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.499572 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.510142 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.515074 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.593169 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.593465 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz9wv\" (UniqueName: \"kubernetes.io/projected/86c7465b-12be-418c-b4c0-53ad1b9e07a5-kube-api-access-lz9wv\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.593532 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.593550 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-logs\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.593581 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.593637 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.593652 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698190 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz9wv\" (UniqueName: \"kubernetes.io/projected/86c7465b-12be-418c-b4c0-53ad1b9e07a5-kube-api-access-lz9wv\") pod \"glance-default-external-api-0\" (UID: 
\"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698286 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698308 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698335 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698357 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-logs\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698380 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698400 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698441 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698464 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c92pn\" (UniqueName: \"kubernetes.io/projected/1c74107c-ceda-4cc8-becc-292f998dc2e6-kube-api-access-c92pn\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698490 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698504 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698543 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698556 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.698595 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.699028 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.699583 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.700572 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-logs\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.707843 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.710714 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 
30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.722806 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz9wv\" (UniqueName: \"kubernetes.io/projected/86c7465b-12be-418c-b4c0-53ad1b9e07a5-kube-api-access-lz9wv\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.723677 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.780955 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.800709 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.800763 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c92pn\" (UniqueName: \"kubernetes.io/projected/1c74107c-ceda-4cc8-becc-292f998dc2e6-kube-api-access-c92pn\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.800807 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.800821 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.800926 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.800947 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.800985 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.801138 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.801346 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-logs\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.801718 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.806550 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.807954 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.808585 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.829616 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c92pn\" (UniqueName: \"kubernetes.io/projected/1c74107c-ceda-4cc8-becc-292f998dc2e6-kube-api-access-c92pn\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.844474 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.962260 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.967526 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c143d7ec-7d96-4365-8e88-c2f41c55ebf7","Type":"ContainerStarted","Data":"728372e88f1f143b41acf904b9c9e1c86e75ca1dca5acbea24d0b4ec4389aa37"} Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.974432 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" event={"ID":"0575e76f-c529-41f7-8b65-87ec77ec9614","Type":"ContainerStarted","Data":"a90ec68a2e50f0c04f4fd720ed306218475941e90fe1cccf811a39872804beab"} Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.977604 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.993084 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-d6457597d-9bs7l" podStartSLOduration=5.185702478 podStartE2EDuration="7.993062607s" podCreationTimestamp="2026-01-30 18:49:45 +0000 UTC" firstStartedPulling="2026-01-30 18:49:47.073043019 +0000 UTC m=+1163.341421034" lastFinishedPulling="2026-01-30 18:49:49.880403138 +0000 UTC m=+1166.148781163" observedRunningTime="2026-01-30 18:49:52.988943285 +0000 UTC m=+1169.257321310" watchObservedRunningTime="2026-01-30 18:49:52.993062607 +0000 UTC m=+1169.261440632" Jan 30 18:49:52 crc kubenswrapper[4782]: I0130 18:49:52.993875 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerStarted","Data":"728d420c86f814ef192851b23b6445aaa96498e9a053f33a700f6554f613f4d4"} Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.001415 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" event={"ID":"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5","Type":"ContainerStarted","Data":"7705a502c2ec01a5712891bc8467e5e87c5727bc7c46e19ff2a77e3e962d8bbe"} Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.006417 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64684dfb44-vvmcx" event={"ID":"076239f6-df06-4f9a-a0e6-70413767a0c9","Type":"ContainerStarted","Data":"9454f90cec4c047dd7f4d13b8551aa39d63486a6d2cdfa864f0b839315136218"} Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.007678 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.010175 4782 generic.go:334] "Generic (PLEG): container finished" podID="303f380e-9ed9-4b93-a654-6bff8df34a6d" containerID="8f4654ed2c4ac81764d88e9d557fb850ebcbe5a6f8e1c5fb20784962ed66bd79" exitCode=0 Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.010256 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" event={"ID":"303f380e-9ed9-4b93-a654-6bff8df34a6d","Type":"ContainerDied","Data":"8f4654ed2c4ac81764d88e9d557fb850ebcbe5a6f8e1c5fb20784962ed66bd79"} Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.029071 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20f1886b-941e-48f2-baf7-0116fe6b689e","Type":"ContainerStarted","Data":"47991c21381fd580ecc306b0e7358ab9e8a65c7e98ec6062c9a5e8a145fe5bb6"} Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 
18:49:53.034654 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-64684dfb44-vvmcx" podStartSLOduration=7.034640065 podStartE2EDuration="7.034640065s" podCreationTimestamp="2026-01-30 18:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:53.032913883 +0000 UTC m=+1169.301291908" watchObservedRunningTime="2026-01-30 18:49:53.034640065 +0000 UTC m=+1169.303018090" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.686515 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.686571 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.831135 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-swift-storage-0\") pod \"303f380e-9ed9-4b93-a654-6bff8df34a6d\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.831183 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvpf2\" (UniqueName: \"kubernetes.io/projected/303f380e-9ed9-4b93-a654-6bff8df34a6d-kube-api-access-tvpf2\") pod \"303f380e-9ed9-4b93-a654-6bff8df34a6d\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.831287 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-sb\") pod \"303f380e-9ed9-4b93-a654-6bff8df34a6d\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.831396 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-nb\") pod \"303f380e-9ed9-4b93-a654-6bff8df34a6d\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.831459 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-config\") pod \"303f380e-9ed9-4b93-a654-6bff8df34a6d\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.831490 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-svc\") pod \"303f380e-9ed9-4b93-a654-6bff8df34a6d\" (UID: \"303f380e-9ed9-4b93-a654-6bff8df34a6d\") " Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.849256 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303f380e-9ed9-4b93-a654-6bff8df34a6d-kube-api-access-tvpf2" (OuterVolumeSpecName: "kube-api-access-tvpf2") pod "303f380e-9ed9-4b93-a654-6bff8df34a6d" (UID: "303f380e-9ed9-4b93-a654-6bff8df34a6d"). InnerVolumeSpecName "kube-api-access-tvpf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.863197 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "303f380e-9ed9-4b93-a654-6bff8df34a6d" (UID: "303f380e-9ed9-4b93-a654-6bff8df34a6d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.871415 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "303f380e-9ed9-4b93-a654-6bff8df34a6d" (UID: "303f380e-9ed9-4b93-a654-6bff8df34a6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.896177 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "303f380e-9ed9-4b93-a654-6bff8df34a6d" (UID: "303f380e-9ed9-4b93-a654-6bff8df34a6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.896533 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-config" (OuterVolumeSpecName: "config") pod "303f380e-9ed9-4b93-a654-6bff8df34a6d" (UID: "303f380e-9ed9-4b93-a654-6bff8df34a6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.933114 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.938170 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.938189 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.938198 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.938208 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvpf2\" (UniqueName: \"kubernetes.io/projected/303f380e-9ed9-4b93-a654-6bff8df34a6d-kube-api-access-tvpf2\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.938216 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:53 crc kubenswrapper[4782]: I0130 18:49:53.949842 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod 
"303f380e-9ed9-4b93-a654-6bff8df34a6d" (UID: "303f380e-9ed9-4b93-a654-6bff8df34a6d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.041743 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/303f380e-9ed9-4b93-a654-6bff8df34a6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.068438 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerStarted","Data":"325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f"} Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.068731 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.068749 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerStarted","Data":"47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6"} Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.087827 4782 generic.go:334] "Generic (PLEG): container finished" podID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerID="7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe" exitCode=0 Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.087903 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" event={"ID":"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5","Type":"ContainerDied","Data":"7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe"} Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.101716 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" event={"ID":"303f380e-9ed9-4b93-a654-6bff8df34a6d","Type":"ContainerDied","Data":"0c5db487d3ff84bb4b5a8c500a91b90fd4641f982d34d86066f2af9db5f370bd"} Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.101759 4782 scope.go:117] "RemoveContainer" containerID="8f4654ed2c4ac81764d88e9d557fb850ebcbe5a6f8e1c5fb20784962ed66bd79" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.101862 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-556f5c66f5-rp6pr" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.128258 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86c7465b-12be-418c-b4c0-53ad1b9e07a5","Type":"ContainerStarted","Data":"8975f5c42ddc8a34bfbead8ec48bb2d66f9e11e55926ab3f2361adedc8f122f9"} Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.142410 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20f1886b-941e-48f2-baf7-0116fe6b689e","Type":"ContainerStarted","Data":"3a14d2040f8c9fffb42b4ae1db2b12d191389741b3b6ce4726e33ea27a308886"} Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.275534 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-556f5c66f5-rp6pr"] Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.298308 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-556f5c66f5-rp6pr"] Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.336462 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.336435846 podStartE2EDuration="7.336435846s" podCreationTimestamp="2026-01-30 18:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:54.226047216 +0000 UTC m=+1170.494425241" watchObservedRunningTime="2026-01-30 18:49:54.336435846 +0000 UTC m=+1170.604813871" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.442073 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303f380e-9ed9-4b93-a654-6bff8df34a6d" path="/var/lib/kubelet/pods/303f380e-9ed9-4b93-a654-6bff8df34a6d/volumes" Jan 30 18:49:54 crc kubenswrapper[4782]: E0130 18:49:54.493900 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db1b143_b7ce_4cc4_8412_6e3402508e98.slice/crio-89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db1b143_b7ce_4cc4_8412_6e3402508e98.slice/crio-conmon-89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303f380e_9ed9_4b93_a654_6bff8df34a6d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303f380e_9ed9_4b93_a654_6bff8df34a6d.slice/crio-0c5db487d3ff84bb4b5a8c500a91b90fd4641f982d34d86066f2af9db5f370bd\": RecentStats: unable to find data in memory cache]" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.870974 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.968527 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmkk\" (UniqueName: \"kubernetes.io/projected/2db1b143-b7ce-4cc4-8412-6e3402508e98-kube-api-access-fvmkk\") pod \"2db1b143-b7ce-4cc4-8412-6e3402508e98\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.968683 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-config-data\") pod \"2db1b143-b7ce-4cc4-8412-6e3402508e98\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.968729 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-combined-ca-bundle\") pod \"2db1b143-b7ce-4cc4-8412-6e3402508e98\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.968794 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db1b143-b7ce-4cc4-8412-6e3402508e98-logs\") pod \"2db1b143-b7ce-4cc4-8412-6e3402508e98\" (UID: \"2db1b143-b7ce-4cc4-8412-6e3402508e98\") " Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.970434 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db1b143-b7ce-4cc4-8412-6e3402508e98-logs" (OuterVolumeSpecName: "logs") pod "2db1b143-b7ce-4cc4-8412-6e3402508e98" (UID: "2db1b143-b7ce-4cc4-8412-6e3402508e98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:54 crc kubenswrapper[4782]: I0130 18:49:54.975399 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db1b143-b7ce-4cc4-8412-6e3402508e98-kube-api-access-fvmkk" (OuterVolumeSpecName: "kube-api-access-fvmkk") pod "2db1b143-b7ce-4cc4-8412-6e3402508e98" (UID: "2db1b143-b7ce-4cc4-8412-6e3402508e98"). InnerVolumeSpecName "kube-api-access-fvmkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.038259 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2db1b143-b7ce-4cc4-8412-6e3402508e98" (UID: "2db1b143-b7ce-4cc4-8412-6e3402508e98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.070458 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2db1b143-b7ce-4cc4-8412-6e3402508e98-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.070722 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvmkk\" (UniqueName: \"kubernetes.io/projected/2db1b143-b7ce-4cc4-8412-6e3402508e98-kube-api-access-fvmkk\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.070787 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.086460 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-config-data" (OuterVolumeSpecName: "config-data") pod "2db1b143-b7ce-4cc4-8412-6e3402508e98" (UID: "2db1b143-b7ce-4cc4-8412-6e3402508e98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.171678 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c74107c-ceda-4cc8-becc-292f998dc2e6","Type":"ContainerStarted","Data":"6dfbd85b304c02f877866207f444b774b642a322a7f77ec2f38d3ac0a44c4069"} Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.172748 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db1b143-b7ce-4cc4-8412-6e3402508e98-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.181035 4782 generic.go:334] "Generic (PLEG): container finished" podID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" exitCode=137 Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.181092 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2db1b143-b7ce-4cc4-8412-6e3402508e98","Type":"ContainerDied","Data":"89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b"} Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.181119 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"2db1b143-b7ce-4cc4-8412-6e3402508e98","Type":"ContainerDied","Data":"4ec310566051baf8561483fe2226455733ca5f420213de022c0f54091e06ff46"} Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.181136 4782 scope.go:117] "RemoveContainer" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.181210 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.199947 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c143d7ec-7d96-4365-8e88-c2f41c55ebf7","Type":"ContainerStarted","Data":"782ccc4ae9c3383a09060a6d55285a72a4db075adc88a3e58d486e1f54464b9c"} Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.229697 4782 scope.go:117] "RemoveContainer" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.234920 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api-log" containerID="cri-o://47991c21381fd580ecc306b0e7358ab9e8a65c7e98ec6062c9a5e8a145fe5bb6" gracePeriod=30 Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.235105 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api" containerID="cri-o://3a14d2040f8c9fffb42b4ae1db2b12d191389741b3b6ce4726e33ea27a308886" gracePeriod=30 Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.235124 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" event={"ID":"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5","Type":"ContainerStarted","Data":"cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59"} Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.236103 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.236169 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:49:55 crc kubenswrapper[4782]: E0130 18:49:55.237482 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b\": container with ID starting with 89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b not found: ID does not exist" containerID="89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.237525 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b"} err="failed to get container status \"89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b\": rpc error: code = NotFound desc = could not find container \"89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b\": container with ID starting with 89861bce833ba9031c1a3eed484d2539ba8a122c88d6671406386429e41d778b not found: ID does not exist" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.250133 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.269171 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.305385 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:49:55 crc kubenswrapper[4782]: E0130 18:49:55.306040 4782 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="303f380e-9ed9-4b93-a654-6bff8df34a6d" containerName="init" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.306052 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="303f380e-9ed9-4b93-a654-6bff8df34a6d" containerName="init" Jan 30 18:49:55 crc kubenswrapper[4782]: E0130 18:49:55.306078 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.306085 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.306275 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="303f380e-9ed9-4b93-a654-6bff8df34a6d" containerName="init" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.306300 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" containerName="watcher-applier" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.306905 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.316238 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.325461 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.346527 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.60676845 podStartE2EDuration="8.346508912s" podCreationTimestamp="2026-01-30 18:49:47 +0000 UTC" firstStartedPulling="2026-01-30 18:49:50.426725148 +0000 UTC m=+1166.695103173" lastFinishedPulling="2026-01-30 18:49:51.16646561 +0000 UTC m=+1167.434843635" observedRunningTime="2026-01-30 18:49:55.281181447 +0000 UTC m=+1171.549559472" watchObservedRunningTime="2026-01-30 18:49:55.346508912 +0000 UTC m=+1171.614886937" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.351377 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" podStartSLOduration=4.351368912 podStartE2EDuration="4.351368912s" podCreationTimestamp="2026-01-30 18:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:55.335980312 +0000 UTC m=+1171.604358337" watchObservedRunningTime="2026-01-30 18:49:55.351368912 +0000 UTC m=+1171.619746937" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.379359 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e15b60d-e1ab-4144-a82d-021b51750157-config-data\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.379411 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srzs6\" (UniqueName: \"kubernetes.io/projected/1e15b60d-e1ab-4144-a82d-021b51750157-kube-api-access-srzs6\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 
18:49:55.379441 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e15b60d-e1ab-4144-a82d-021b51750157-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.379477 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e15b60d-e1ab-4144-a82d-021b51750157-logs\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.481805 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e15b60d-e1ab-4144-a82d-021b51750157-config-data\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.481866 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srzs6\" (UniqueName: \"kubernetes.io/projected/1e15b60d-e1ab-4144-a82d-021b51750157-kube-api-access-srzs6\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.481904 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e15b60d-e1ab-4144-a82d-021b51750157-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.481944 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e15b60d-e1ab-4144-a82d-021b51750157-logs\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.482619 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e15b60d-e1ab-4144-a82d-021b51750157-logs\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.485801 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e15b60d-e1ab-4144-a82d-021b51750157-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.486639 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e15b60d-e1ab-4144-a82d-021b51750157-config-data\") pod \"watcher-applier-0\" (UID: \"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.496810 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srzs6\" (UniqueName: \"kubernetes.io/projected/1e15b60d-e1ab-4144-a82d-021b51750157-kube-api-access-srzs6\") pod \"watcher-applier-0\" (UID: 
\"1e15b60d-e1ab-4144-a82d-021b51750157\") " pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.608693 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.938732 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-695d477669-wlmct"] Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.941370 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.944683 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.944881 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 30 18:49:55 crc kubenswrapper[4782]: I0130 18:49:55.967936 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695d477669-wlmct"] Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.005111 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-internal-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.005249 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-config\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.005375 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-public-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.005441 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-httpd-config\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.005544 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-ovndb-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.005636 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-combined-ca-bundle\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.005768 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h84p\" (UniqueName: \"kubernetes.io/projected/ed784e83-4524-4c3f-8697-ea3821f297b1-kube-api-access-5h84p\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.107687 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-public-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.107730 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-httpd-config\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.107785 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-ovndb-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.107831 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-combined-ca-bundle\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.107873 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h84p\" (UniqueName: \"kubernetes.io/projected/ed784e83-4524-4c3f-8697-ea3821f297b1-kube-api-access-5h84p\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.107920 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-internal-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.107937 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-config\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.113006 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-config\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.115174 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-ovndb-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.115385 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-combined-ca-bundle\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.115623 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-internal-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.117176 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-httpd-config\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.117988 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed784e83-4524-4c3f-8697-ea3821f297b1-public-tls-certs\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.125719 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h84p\" (UniqueName: \"kubernetes.io/projected/ed784e83-4524-4c3f-8697-ea3821f297b1-kube-api-access-5h84p\") pod \"neutron-695d477669-wlmct\" (UID: \"ed784e83-4524-4c3f-8697-ea3821f297b1\") " pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.220285 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.258843 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86c7465b-12be-418c-b4c0-53ad1b9e07a5","Type":"ContainerStarted","Data":"fc68a86814f9f94327773bd98736a48827e0e48149561cc98487f4e33a442e5f"} Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.280454 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c74107c-ceda-4cc8-becc-292f998dc2e6","Type":"ContainerStarted","Data":"b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4"} Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.287364 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.377816 4782 generic.go:334] "Generic (PLEG): container finished" podID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerID="3a14d2040f8c9fffb42b4ae1db2b12d191389741b3b6ce4726e33ea27a308886" exitCode=0 Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.377844 4782 generic.go:334] "Generic (PLEG): container finished" podID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerID="47991c21381fd580ecc306b0e7358ab9e8a65c7e98ec6062c9a5e8a145fe5bb6" exitCode=143 Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.377886 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20f1886b-941e-48f2-baf7-0116fe6b689e","Type":"ContainerDied","Data":"3a14d2040f8c9fffb42b4ae1db2b12d191389741b3b6ce4726e33ea27a308886"} Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.377913 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20f1886b-941e-48f2-baf7-0116fe6b689e","Type":"ContainerDied","Data":"47991c21381fd580ecc306b0e7358ab9e8a65c7e98ec6062c9a5e8a145fe5bb6"} Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.385272 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerStarted","Data":"288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a"} Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.466818 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db1b143-b7ce-4cc4-8412-6e3402508e98" path="/var/lib/kubelet/pods/2db1b143-b7ce-4cc4-8412-6e3402508e98/volumes" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.582181 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.629093 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvgmx\" (UniqueName: \"kubernetes.io/projected/20f1886b-941e-48f2-baf7-0116fe6b689e-kube-api-access-zvgmx\") pod \"20f1886b-941e-48f2-baf7-0116fe6b689e\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.629185 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-combined-ca-bundle\") pod \"20f1886b-941e-48f2-baf7-0116fe6b689e\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.629212 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20f1886b-941e-48f2-baf7-0116fe6b689e-etc-machine-id\") pod \"20f1886b-941e-48f2-baf7-0116fe6b689e\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.629279 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data-custom\") pod \"20f1886b-941e-48f2-baf7-0116fe6b689e\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.629338 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data\") pod \"20f1886b-941e-48f2-baf7-0116fe6b689e\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.629360 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-scripts\") pod \"20f1886b-941e-48f2-baf7-0116fe6b689e\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.629402 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20f1886b-941e-48f2-baf7-0116fe6b689e-logs\") pod \"20f1886b-941e-48f2-baf7-0116fe6b689e\" (UID: \"20f1886b-941e-48f2-baf7-0116fe6b689e\") " Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.630257 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20f1886b-941e-48f2-baf7-0116fe6b689e-logs" (OuterVolumeSpecName: "logs") pod "20f1886b-941e-48f2-baf7-0116fe6b689e" (UID: "20f1886b-941e-48f2-baf7-0116fe6b689e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.634957 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20f1886b-941e-48f2-baf7-0116fe6b689e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "20f1886b-941e-48f2-baf7-0116fe6b689e" (UID: "20f1886b-941e-48f2-baf7-0116fe6b689e"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.652141 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-scripts" (OuterVolumeSpecName: "scripts") pod "20f1886b-941e-48f2-baf7-0116fe6b689e" (UID: "20f1886b-941e-48f2-baf7-0116fe6b689e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.652179 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "20f1886b-941e-48f2-baf7-0116fe6b689e" (UID: "20f1886b-941e-48f2-baf7-0116fe6b689e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.652337 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f1886b-941e-48f2-baf7-0116fe6b689e-kube-api-access-zvgmx" (OuterVolumeSpecName: "kube-api-access-zvgmx") pod "20f1886b-941e-48f2-baf7-0116fe6b689e" (UID: "20f1886b-941e-48f2-baf7-0116fe6b689e"). InnerVolumeSpecName "kube-api-access-zvgmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.692323 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20f1886b-941e-48f2-baf7-0116fe6b689e" (UID: "20f1886b-941e-48f2-baf7-0116fe6b689e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.722348 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data" (OuterVolumeSpecName: "config-data") pod "20f1886b-941e-48f2-baf7-0116fe6b689e" (UID: "20f1886b-941e-48f2-baf7-0116fe6b689e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.734592 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.734638 4782 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20f1886b-941e-48f2-baf7-0116fe6b689e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.734651 4782 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.734667 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.734679 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20f1886b-941e-48f2-baf7-0116fe6b689e-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.734690 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20f1886b-941e-48f2-baf7-0116fe6b689e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:56 crc kubenswrapper[4782]: I0130 18:49:56.734700 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvgmx\" (UniqueName: \"kubernetes.io/projected/20f1886b-941e-48f2-baf7-0116fe6b689e-kube-api-access-zvgmx\") on node \"crc\" DevicePath \"\"" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.109165 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-695d477669-wlmct"] Jan 30 18:49:57 crc kubenswrapper[4782]: W0130 18:49:57.115207 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded784e83_4524_4c3f_8697_ea3821f297b1.slice/crio-9a2d9d71518a5a397a84b65cba6eea6c11532b550bbd48596ddaf9fbd406475e WatchSource:0}: Error finding container 9a2d9d71518a5a397a84b65cba6eea6c11532b550bbd48596ddaf9fbd406475e: Status 404 returned error can't find the container with id 9a2d9d71518a5a397a84b65cba6eea6c11532b550bbd48596ddaf9fbd406475e Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.394490 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1e15b60d-e1ab-4144-a82d-021b51750157","Type":"ContainerStarted","Data":"0862796165fccac2c267243289993502e9c6dd842872a95c165c790a3bb5e1c5"} Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.394811 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1e15b60d-e1ab-4144-a82d-021b51750157","Type":"ContainerStarted","Data":"7f9d6c53a138783220996ba77c2c8bbbb78a9ffb52408acb0c549b041829c794"} Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.398465 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86c7465b-12be-418c-b4c0-53ad1b9e07a5","Type":"ContainerStarted","Data":"0292bcdabb014299dd67bd261cac8e249bb88360ac0fe174654c0b64b7c6741c"} Jan 30 18:49:57 crc 
kubenswrapper[4782]: I0130 18:49:57.400770 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695d477669-wlmct" event={"ID":"ed784e83-4524-4c3f-8697-ea3821f297b1","Type":"ContainerStarted","Data":"9a2d9d71518a5a397a84b65cba6eea6c11532b550bbd48596ddaf9fbd406475e"} Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.402957 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c74107c-ceda-4cc8-becc-292f998dc2e6","Type":"ContainerStarted","Data":"e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343"} Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.405285 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"20f1886b-941e-48f2-baf7-0116fe6b689e","Type":"ContainerDied","Data":"29dc8d8dfe511aab5198ec37a7f525f9a5d3be9f61db4b7122de4421ec08c8c0"} Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.405335 4782 scope.go:117] "RemoveContainer" containerID="3a14d2040f8c9fffb42b4ae1db2b12d191389741b3b6ce4726e33ea27a308886" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.405450 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.429910 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=2.42989128 podStartE2EDuration="2.42989128s" podCreationTimestamp="2026-01-30 18:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:57.421191305 +0000 UTC m=+1173.689569330" watchObservedRunningTime="2026-01-30 18:49:57.42989128 +0000 UTC m=+1173.698269305" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.467733 4782 scope.go:117] "RemoveContainer" containerID="47991c21381fd580ecc306b0e7358ab9e8a65c7e98ec6062c9a5e8a145fe5bb6" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.474372 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.47434887 podStartE2EDuration="6.47434887s" podCreationTimestamp="2026-01-30 18:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:57.463499541 +0000 UTC m=+1173.731877566" watchObservedRunningTime="2026-01-30 18:49:57.47434887 +0000 UTC m=+1173.742726895" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.504409 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.504394593 podStartE2EDuration="6.504394593s" podCreationTimestamp="2026-01-30 18:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:57.496798245 +0000 UTC m=+1173.765176270" watchObservedRunningTime="2026-01-30 18:49:57.504394593 +0000 UTC m=+1173.772772618" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.528111 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.544768 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.595337 4782 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-api-0"] Jan 30 18:49:57 crc kubenswrapper[4782]: E0130 18:49:57.595785 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api-log" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.595802 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api-log" Jan 30 18:49:57 crc kubenswrapper[4782]: E0130 18:49:57.595816 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.595823 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.596008 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api-log" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.596023 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" containerName="cinder-api" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.597188 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.611210 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.611391 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.611480 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.625436 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.634487 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.675572 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.697920 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.697975 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9j94\" (UniqueName: \"kubernetes.io/projected/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-kube-api-access-j9j94\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.698048 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: 
I0130 18:49:57.698109 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.698149 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.698173 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-logs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.698201 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-config-data\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.698299 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-scripts\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.698331 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800678 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9j94\" (UniqueName: \"kubernetes.io/projected/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-kube-api-access-j9j94\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800742 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800767 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800790 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-combined-ca-bundle\") pod \"cinder-api-0\" 
(UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800812 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-logs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800835 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-config-data\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800864 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-scripts\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800893 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.800945 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.802741 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.801893 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-logs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.805415 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-scripts\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.813864 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.815119 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-config-data\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc 
kubenswrapper[4782]: I0130 18:49:57.815882 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.816005 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.816888 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.826740 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9j94\" (UniqueName: \"kubernetes.io/projected/d3d752ed-dc31-49b8-80ce-b3b94f07dcf3-kube-api-access-j9j94\") pod \"cinder-api-0\" (UID: \"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3\") " pod="openstack/cinder-api-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.853675 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.863808 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.887133 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.887755 4782 scope.go:117] "RemoveContainer" containerID="36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3" Jan 30 18:49:57 crc kubenswrapper[4782]: E0130 18:49:57.887959 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(e82abe4a-d9ad-47dd-bd5c-2704052ba388)\"" pod="openstack/watcher-decision-engine-0" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" Jan 30 18:49:57 crc kubenswrapper[4782]: I0130 18:49:57.927793 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.082711 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.418111 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.445149 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f1886b-941e-48f2-baf7-0116fe6b689e" path="/var/lib/kubelet/pods/20f1886b-941e-48f2-baf7-0116fe6b689e/volumes" Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.445999 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.446326 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.446341 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerStarted","Data":"5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89"} Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.449498 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695d477669-wlmct" event={"ID":"ed784e83-4524-4c3f-8697-ea3821f297b1","Type":"ContainerStarted","Data":"7d2605d8f50ddd792fd1b91180a66a69a159e9d18f558b6b1ea7f6ad2251a31e"} Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.449539 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-695d477669-wlmct" event={"ID":"ed784e83-4524-4c3f-8697-ea3821f297b1","Type":"ContainerStarted","Data":"7e75f3580625c60b8041e23c6e179942841212189c15fe8c22c489d7932b8d8d"} Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.470095 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.672293335 podStartE2EDuration="8.470074372s" podCreationTimestamp="2026-01-30 18:49:50 +0000 UTC" firstStartedPulling="2026-01-30 18:49:52.190205484 +0000 UTC m=+1168.458583509" lastFinishedPulling="2026-01-30 18:49:57.987986521 +0000 UTC m=+1174.256364546" observedRunningTime="2026-01-30 18:49:58.457637475 +0000 UTC m=+1174.726015500" watchObservedRunningTime="2026-01-30 18:49:58.470074372 +0000 UTC m=+1174.738452397" Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.541706 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-695d477669-wlmct" podStartSLOduration=3.541684252 podStartE2EDuration="3.541684252s" podCreationTimestamp="2026-01-30 18:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:49:58.527886751 +0000 UTC m=+1174.796264776" watchObservedRunningTime="2026-01-30 18:49:58.541684252 +0000 UTC m=+1174.810062277" Jan 30 18:49:58 crc kubenswrapper[4782]: I0130 18:49:58.612889 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.463456 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-log" 
containerID="cri-o://fc68a86814f9f94327773bd98736a48827e0e48149561cc98487f4e33a442e5f" gracePeriod=30 Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.463579 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3","Type":"ContainerStarted","Data":"d2982179841f991de9def4e13272f716b12fb663da1a0986c5fcd81e3723439c"} Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.463787 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="cinder-scheduler" containerID="cri-o://728372e88f1f143b41acf904b9c9e1c86e75ca1dca5acbea24d0b4ec4389aa37" gracePeriod=30 Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.465152 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-log" containerID="cri-o://b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4" gracePeriod=30 Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.465852 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-695d477669-wlmct" Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.466322 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-httpd" containerID="cri-o://0292bcdabb014299dd67bd261cac8e249bb88360ac0fe174654c0b64b7c6741c" gracePeriod=30 Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.467183 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="probe" containerID="cri-o://782ccc4ae9c3383a09060a6d55285a72a4db075adc88a3e58d486e1f54464b9c" gracePeriod=30 Jan 30 18:49:59 crc kubenswrapper[4782]: I0130 18:49:59.467362 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-httpd" containerID="cri-o://e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343" gracePeriod=30 Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.110826 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b5464cf9b-2tbsc" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.460314 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-564c766f5d-2hhs6"] Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.462558 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.465104 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.465672 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.489520 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-564c766f5d-2hhs6"] Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.512456 4782 generic.go:334] "Generic (PLEG): container finished" podID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerID="fc68a86814f9f94327773bd98736a48827e0e48149561cc98487f4e33a442e5f" exitCode=143 Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.512815 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86c7465b-12be-418c-b4c0-53ad1b9e07a5","Type":"ContainerDied","Data":"fc68a86814f9f94327773bd98736a48827e0e48149561cc98487f4e33a442e5f"} Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.516143 4782 generic.go:334] "Generic (PLEG): container finished" podID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerID="b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4" exitCode=143 Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.516244 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c74107c-ceda-4cc8-becc-292f998dc2e6","Type":"ContainerDied","Data":"b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4"} Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.518322 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3","Type":"ContainerStarted","Data":"e06941cf5846fedac70441e1689d6aac766067961439b0bda834c3e63a6fc3d3"} Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.592596 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-combined-ca-bundle\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.592699 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162dd27-124a-4e1c-8a8c-51c4e47fce04-logs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.592799 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-public-tls-certs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.592817 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9xd\" (UniqueName: \"kubernetes.io/projected/5162dd27-124a-4e1c-8a8c-51c4e47fce04-kube-api-access-pv9xd\") pod 
\"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.592834 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-internal-tls-certs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.592859 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-config-data\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.592890 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-config-data-custom\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.609992 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.699372 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-public-tls-certs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.699433 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9xd\" (UniqueName: \"kubernetes.io/projected/5162dd27-124a-4e1c-8a8c-51c4e47fce04-kube-api-access-pv9xd\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.699459 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-internal-tls-certs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.699481 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-config-data\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.699555 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-config-data-custom\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 
18:50:00.699610 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-combined-ca-bundle\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.699732 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162dd27-124a-4e1c-8a8c-51c4e47fce04-logs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.700211 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162dd27-124a-4e1c-8a8c-51c4e47fce04-logs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.707818 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-public-tls-certs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.714194 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-internal-tls-certs\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.715545 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-config-data\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.719490 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-config-data-custom\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.721636 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv9xd\" (UniqueName: \"kubernetes.io/projected/5162dd27-124a-4e1c-8a8c-51c4e47fce04-kube-api-access-pv9xd\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.724261 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162dd27-124a-4e1c-8a8c-51c4e47fce04-combined-ca-bundle\") pod \"barbican-api-564c766f5d-2hhs6\" (UID: \"5162dd27-124a-4e1c-8a8c-51c4e47fce04\") " pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:00 crc kubenswrapper[4782]: I0130 18:50:00.804790 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.367031 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.514594 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c92pn\" (UniqueName: \"kubernetes.io/projected/1c74107c-ceda-4cc8-becc-292f998dc2e6-kube-api-access-c92pn\") pod \"1c74107c-ceda-4cc8-becc-292f998dc2e6\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.514684 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-httpd-run\") pod \"1c74107c-ceda-4cc8-becc-292f998dc2e6\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.514782 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-scripts\") pod \"1c74107c-ceda-4cc8-becc-292f998dc2e6\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.514832 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1c74107c-ceda-4cc8-becc-292f998dc2e6\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.514860 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-combined-ca-bundle\") pod \"1c74107c-ceda-4cc8-becc-292f998dc2e6\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.514895 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-logs\") pod \"1c74107c-ceda-4cc8-becc-292f998dc2e6\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.514926 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-config-data\") pod \"1c74107c-ceda-4cc8-becc-292f998dc2e6\" (UID: \"1c74107c-ceda-4cc8-becc-292f998dc2e6\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.515954 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c74107c-ceda-4cc8-becc-292f998dc2e6" (UID: "1c74107c-ceda-4cc8-becc-292f998dc2e6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.519325 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-logs" (OuterVolumeSpecName: "logs") pod "1c74107c-ceda-4cc8-becc-292f998dc2e6" (UID: "1c74107c-ceda-4cc8-becc-292f998dc2e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.521242 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-scripts" (OuterVolumeSpecName: "scripts") pod "1c74107c-ceda-4cc8-becc-292f998dc2e6" (UID: "1c74107c-ceda-4cc8-becc-292f998dc2e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.524423 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "1c74107c-ceda-4cc8-becc-292f998dc2e6" (UID: "1c74107c-ceda-4cc8-becc-292f998dc2e6"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.531759 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c74107c-ceda-4cc8-becc-292f998dc2e6-kube-api-access-c92pn" (OuterVolumeSpecName: "kube-api-access-c92pn") pod "1c74107c-ceda-4cc8-becc-292f998dc2e6" (UID: "1c74107c-ceda-4cc8-becc-292f998dc2e6"). InnerVolumeSpecName "kube-api-access-c92pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.542189 4782 generic.go:334] "Generic (PLEG): container finished" podID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerID="782ccc4ae9c3383a09060a6d55285a72a4db075adc88a3e58d486e1f54464b9c" exitCode=0 Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.542365 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c143d7ec-7d96-4365-8e88-c2f41c55ebf7","Type":"ContainerDied","Data":"782ccc4ae9c3383a09060a6d55285a72a4db075adc88a3e58d486e1f54464b9c"} Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.545316 4782 generic.go:334] "Generic (PLEG): container finished" podID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerID="0292bcdabb014299dd67bd261cac8e249bb88360ac0fe174654c0b64b7c6741c" exitCode=0 Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.545453 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86c7465b-12be-418c-b4c0-53ad1b9e07a5","Type":"ContainerDied","Data":"0292bcdabb014299dd67bd261cac8e249bb88360ac0fe174654c0b64b7c6741c"} Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.554586 4782 generic.go:334] "Generic (PLEG): container finished" podID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerID="e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343" exitCode=0 Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.554633 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c74107c-ceda-4cc8-becc-292f998dc2e6","Type":"ContainerDied","Data":"e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343"} Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.554844 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1c74107c-ceda-4cc8-becc-292f998dc2e6","Type":"ContainerDied","Data":"6dfbd85b304c02f877866207f444b774b642a322a7f77ec2f38d3ac0a44c4069"} Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.554860 4782 scope.go:117] "RemoveContainer" containerID="e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343" Jan 30 18:50:01 crc 
kubenswrapper[4782]: I0130 18:50:01.555003 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.570329 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c74107c-ceda-4cc8-becc-292f998dc2e6" (UID: "1c74107c-ceda-4cc8-becc-292f998dc2e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.585611 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.588370 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-config-data" (OuterVolumeSpecName: "config-data") pod "1c74107c-ceda-4cc8-becc-292f998dc2e6" (UID: "1c74107c-ceda-4cc8-becc-292f998dc2e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.594363 4782 scope.go:117] "RemoveContainer" containerID="b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.617178 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.617582 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.617685 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.617770 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.617847 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c74107c-ceda-4cc8-becc-292f998dc2e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.617928 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c92pn\" (UniqueName: \"kubernetes.io/projected/1c74107c-ceda-4cc8-becc-292f998dc2e6-kube-api-access-c92pn\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.618002 4782 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c74107c-ceda-4cc8-becc-292f998dc2e6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.654069 4782 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.719605 4782 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-564c766f5d-2hhs6"] Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720212 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wv\" (UniqueName: \"kubernetes.io/projected/86c7465b-12be-418c-b4c0-53ad1b9e07a5-kube-api-access-lz9wv\") pod \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720272 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-combined-ca-bundle\") pod \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720511 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720542 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-logs\") pod \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720562 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-scripts\") pod \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720598 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-httpd-run\") pod \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720635 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-config-data\") pod \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\" (UID: \"86c7465b-12be-418c-b4c0-53ad1b9e07a5\") " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.720999 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-logs" (OuterVolumeSpecName: "logs") pod "86c7465b-12be-418c-b4c0-53ad1b9e07a5" (UID: "86c7465b-12be-418c-b4c0-53ad1b9e07a5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.721280 4782 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.721297 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.724242 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c7465b-12be-418c-b4c0-53ad1b9e07a5-kube-api-access-lz9wv" (OuterVolumeSpecName: "kube-api-access-lz9wv") pod "86c7465b-12be-418c-b4c0-53ad1b9e07a5" (UID: "86c7465b-12be-418c-b4c0-53ad1b9e07a5"). InnerVolumeSpecName "kube-api-access-lz9wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.725909 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-scripts" (OuterVolumeSpecName: "scripts") pod "86c7465b-12be-418c-b4c0-53ad1b9e07a5" (UID: "86c7465b-12be-418c-b4c0-53ad1b9e07a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.727547 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "86c7465b-12be-418c-b4c0-53ad1b9e07a5" (UID: "86c7465b-12be-418c-b4c0-53ad1b9e07a5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.737757 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "86c7465b-12be-418c-b4c0-53ad1b9e07a5" (UID: "86c7465b-12be-418c-b4c0-53ad1b9e07a5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.775351 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.788520 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86c7465b-12be-418c-b4c0-53ad1b9e07a5" (UID: "86c7465b-12be-418c-b4c0-53ad1b9e07a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.805210 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-config-data" (OuterVolumeSpecName: "config-data") pod "86c7465b-12be-418c-b4c0-53ad1b9e07a5" (UID: "86c7465b-12be-418c-b4c0-53ad1b9e07a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.835164 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.835215 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.835240 4782 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86c7465b-12be-418c-b4c0-53ad1b9e07a5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.835250 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.835262 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wv\" (UniqueName: \"kubernetes.io/projected/86c7465b-12be-418c-b4c0-53ad1b9e07a5-kube-api-access-lz9wv\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.835278 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c7465b-12be-418c-b4c0-53ad1b9e07a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.905222 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc4f6997f-vmkbp"] Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.905476 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerName="dnsmasq-dns" containerID="cri-o://65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1" gracePeriod=10 Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.908485 4782 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 30 18:50:01 crc kubenswrapper[4782]: I0130 18:50:01.937380 4782 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.019382 4782 scope.go:117] "RemoveContainer" containerID="e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.020009 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343\": container with ID starting with e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343 not found: ID does not exist" containerID="e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.020037 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343"} err="failed to get container status 
\"e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343\": rpc error: code = NotFound desc = could not find container \"e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343\": container with ID starting with e0b382f41d184f4dcb42cf98d8774d57dca0cbc4c6952ed141c3f1d77e8b9343 not found: ID does not exist" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.020058 4782 scope.go:117] "RemoveContainer" containerID="b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.020501 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4\": container with ID starting with b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4 not found: ID does not exist" containerID="b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.020526 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4"} err="failed to get container status \"b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4\": rpc error: code = NotFound desc = could not find container \"b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4\": container with ID starting with b6e6f10a96d08aeb50e505d46729243f3c404dca6651d65dc75053307cdeccb4 not found: ID does not exist" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.050155 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.059701 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.069861 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.070267 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-httpd" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070285 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-httpd" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.070300 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-httpd" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070307 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-httpd" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.070331 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-log" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070338 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-log" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.070347 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-log" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070353 4782 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-log" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070553 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-httpd" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070578 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" containerName="glance-log" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070590 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-log" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.070599 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" containerName="glance-httpd" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.071673 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.075570 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.081720 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.084642 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148312 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148362 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr8jm\" (UniqueName: \"kubernetes.io/projected/3bb19d14-c078-4fee-81ab-6246f1f56059-kube-api-access-lr8jm\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148382 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-logs\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148404 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148588 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148636 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148717 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.148907 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250265 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250388 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250416 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr8jm\" (UniqueName: \"kubernetes.io/projected/3bb19d14-c078-4fee-81ab-6246f1f56059-kube-api-access-lr8jm\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250433 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-logs\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250452 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250499 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250518 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.250550 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.251647 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-logs\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.252936 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.253044 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.257313 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.259534 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.261123 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.262914 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.267900 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr8jm\" (UniqueName: \"kubernetes.io/projected/3bb19d14-c078-4fee-81ab-6246f1f56059-kube-api-access-lr8jm\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.302683 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.396675 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.421145 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c74107c-ceda-4cc8-becc-292f998dc2e6" path="/var/lib/kubelet/pods/1c74107c-ceda-4cc8-becc-292f998dc2e6/volumes" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.462858 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.558285 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0\") pod \"60cc626f-3c26-4417-87f0-5000cdbaadda\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.558373 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shsqj\" (UniqueName: \"kubernetes.io/projected/60cc626f-3c26-4417-87f0-5000cdbaadda-kube-api-access-shsqj\") pod \"60cc626f-3c26-4417-87f0-5000cdbaadda\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.558441 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-svc\") pod \"60cc626f-3c26-4417-87f0-5000cdbaadda\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.558473 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-sb\") pod \"60cc626f-3c26-4417-87f0-5000cdbaadda\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.558495 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-config\") pod \"60cc626f-3c26-4417-87f0-5000cdbaadda\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.558555 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-nb\") pod \"60cc626f-3c26-4417-87f0-5000cdbaadda\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.569387 4782 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/60cc626f-3c26-4417-87f0-5000cdbaadda-kube-api-access-shsqj" (OuterVolumeSpecName: "kube-api-access-shsqj") pod "60cc626f-3c26-4417-87f0-5000cdbaadda" (UID: "60cc626f-3c26-4417-87f0-5000cdbaadda"). InnerVolumeSpecName "kube-api-access-shsqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.575877 4782 generic.go:334] "Generic (PLEG): container finished" podID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerID="728372e88f1f143b41acf904b9c9e1c86e75ca1dca5acbea24d0b4ec4389aa37" exitCode=0 Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.575940 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c143d7ec-7d96-4365-8e88-c2f41c55ebf7","Type":"ContainerDied","Data":"728372e88f1f143b41acf904b9c9e1c86e75ca1dca5acbea24d0b4ec4389aa37"} Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.583845 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-564c766f5d-2hhs6" event={"ID":"5162dd27-124a-4e1c-8a8c-51c4e47fce04","Type":"ContainerStarted","Data":"64c4eaab201d403a94376eb94422ef75abcf3a2a690a513643ce46225f26671e"} Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.583906 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-564c766f5d-2hhs6" event={"ID":"5162dd27-124a-4e1c-8a8c-51c4e47fce04","Type":"ContainerStarted","Data":"d94e17aeae9235955fc3fd2b713e099fbffd06d9f394df68e865fb513dc18cfe"} Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.585341 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3d752ed-dc31-49b8-80ce-b3b94f07dcf3","Type":"ContainerStarted","Data":"82612bfd5c4ead011f3e58a74027cb3b019dd45a8aa3a7aa868f3f4a91688a13"} Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.586423 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.588769 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86c7465b-12be-418c-b4c0-53ad1b9e07a5","Type":"ContainerDied","Data":"8975f5c42ddc8a34bfbead8ec48bb2d66f9e11e55926ab3f2361adedc8f122f9"} Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.588808 4782 scope.go:117] "RemoveContainer" containerID="0292bcdabb014299dd67bd261cac8e249bb88360ac0fe174654c0b64b7c6741c" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.589143 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.605434 4782 generic.go:334] "Generic (PLEG): container finished" podID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerID="65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1" exitCode=0 Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.606366 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60cc626f-3c26-4417-87f0-5000cdbaadda" (UID: "60cc626f-3c26-4417-87f0-5000cdbaadda"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.606492 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.608687 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" event={"ID":"60cc626f-3c26-4417-87f0-5000cdbaadda","Type":"ContainerDied","Data":"65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1"} Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.608747 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" event={"ID":"60cc626f-3c26-4417-87f0-5000cdbaadda","Type":"ContainerDied","Data":"4ee324cff937ca7e6b51908207a474bd16466e617a15c1cd7df120a4654a88b0"} Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.622806 4782 scope.go:117] "RemoveContainer" containerID="fc68a86814f9f94327773bd98736a48827e0e48149561cc98487f4e33a442e5f" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.639877 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.63985104 podStartE2EDuration="5.63985104s" podCreationTimestamp="2026-01-30 18:49:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:02.623503126 +0000 UTC m=+1178.891881151" watchObservedRunningTime="2026-01-30 18:50:02.63985104 +0000 UTC m=+1178.908229065" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.657239 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.662725 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60cc626f-3c26-4417-87f0-5000cdbaadda" (UID: "60cc626f-3c26-4417-87f0-5000cdbaadda"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.663096 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0\") pod \"60cc626f-3c26-4417-87f0-5000cdbaadda\" (UID: \"60cc626f-3c26-4417-87f0-5000cdbaadda\") " Jan 30 18:50:02 crc kubenswrapper[4782]: W0130 18:50:02.664710 4782 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/60cc626f-3c26-4417-87f0-5000cdbaadda/volumes/kubernetes.io~configmap/dns-swift-storage-0 Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.664727 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60cc626f-3c26-4417-87f0-5000cdbaadda" (UID: "60cc626f-3c26-4417-87f0-5000cdbaadda"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.678159 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.681930 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60cc626f-3c26-4417-87f0-5000cdbaadda" (UID: "60cc626f-3c26-4417-87f0-5000cdbaadda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.684068 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-config" (OuterVolumeSpecName: "config") pod "60cc626f-3c26-4417-87f0-5000cdbaadda" (UID: "60cc626f-3c26-4417-87f0-5000cdbaadda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.687848 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.688210 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerName="dnsmasq-dns" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.688262 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerName="dnsmasq-dns" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.688290 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerName="init" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.688297 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerName="init" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.688633 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerName="dnsmasq-dns" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.689544 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.695138 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.698017 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.700897 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60cc626f-3c26-4417-87f0-5000cdbaadda" (UID: "60cc626f-3c26-4417-87f0-5000cdbaadda"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.701743 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shsqj\" (UniqueName: \"kubernetes.io/projected/60cc626f-3c26-4417-87f0-5000cdbaadda-kube-api-access-shsqj\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.701764 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.701775 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.701783 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.701799 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.701807 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cc626f-3c26-4417-87f0-5000cdbaadda-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.709850 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.782155 4782 scope.go:117] "RemoveContainer" containerID="65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.783058 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.802895 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-config-data\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.802938 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-scripts\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.802972 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.802997 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.803074 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.803107 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp2kq\" (UniqueName: \"kubernetes.io/projected/f11ce800-5d86-4411-922e-af28bc822732-kube-api-access-xp2kq\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.803130 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.803157 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-logs\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.836566 4782 scope.go:117] "RemoveContainer" containerID="409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.888385 4782 scope.go:117] 
"RemoveContainer" containerID="65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.894928 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1\": container with ID starting with 65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1 not found: ID does not exist" containerID="65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.894970 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1"} err="failed to get container status \"65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1\": rpc error: code = NotFound desc = could not find container \"65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1\": container with ID starting with 65beffd79b25125f3d7bafb4040ce976f8caa3649197432c1aaa8b02c245d1a1 not found: ID does not exist" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.894994 4782 scope.go:117] "RemoveContainer" containerID="409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe" Jan 30 18:50:02 crc kubenswrapper[4782]: E0130 18:50:02.896782 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe\": container with ID starting with 409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe not found: ID does not exist" containerID="409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.896811 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe"} err="failed to get container status \"409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe\": rpc error: code = NotFound desc = could not find container \"409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe\": container with ID starting with 409b84ba2821745c66b6a474a0392400bf30f097b6d227df74160e3b90324ffe not found: ID does not exist" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906365 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-combined-ca-bundle\") pod \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906508 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-etc-machine-id\") pod \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906538 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data\") pod \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906575 4782 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8zkvq\" (UniqueName: \"kubernetes.io/projected/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-kube-api-access-8zkvq\") pod \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906612 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data-custom\") pod \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906679 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-scripts\") pod \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\" (UID: \"c143d7ec-7d96-4365-8e88-c2f41c55ebf7\") " Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906899 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906941 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp2kq\" (UniqueName: \"kubernetes.io/projected/f11ce800-5d86-4411-922e-af28bc822732-kube-api-access-xp2kq\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906967 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.906994 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-logs\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.907031 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-config-data\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.907053 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-scripts\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.907079 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.907100 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.912599 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.917336 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c143d7ec-7d96-4365-8e88-c2f41c55ebf7" (UID: "c143d7ec-7d96-4365-8e88-c2f41c55ebf7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.923403 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.927323 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-logs\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.927544 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.935447 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c143d7ec-7d96-4365-8e88-c2f41c55ebf7" (UID: "c143d7ec-7d96-4365-8e88-c2f41c55ebf7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.935480 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-scripts\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.947186 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-scripts" (OuterVolumeSpecName: "scripts") pod "c143d7ec-7d96-4365-8e88-c2f41c55ebf7" (UID: "c143d7ec-7d96-4365-8e88-c2f41c55ebf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.948268 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-config-data\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.951933 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp2kq\" (UniqueName: \"kubernetes.io/projected/f11ce800-5d86-4411-922e-af28bc822732-kube-api-access-xp2kq\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.977570 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-kube-api-access-8zkvq" (OuterVolumeSpecName: "kube-api-access-8zkvq") pod "c143d7ec-7d96-4365-8e88-c2f41c55ebf7" (UID: "c143d7ec-7d96-4365-8e88-c2f41c55ebf7"). InnerVolumeSpecName "kube-api-access-8zkvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.979829 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:02 crc kubenswrapper[4782]: I0130 18:50:02.989459 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.010842 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zkvq\" (UniqueName: \"kubernetes.io/projected/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-kube-api-access-8zkvq\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.010886 4782 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.010900 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.010909 4782 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.028954 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c143d7ec-7d96-4365-8e88-c2f41c55ebf7" (UID: "c143d7ec-7d96-4365-8e88-c2f41c55ebf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.052287 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cc4f6997f-vmkbp"] Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.061534 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cc4f6997f-vmkbp"] Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.112561 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.115357 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.118022 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data" (OuterVolumeSpecName: "config-data") pod "c143d7ec-7d96-4365-8e88-c2f41c55ebf7" (UID: "c143d7ec-7d96-4365-8e88-c2f41c55ebf7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.123475 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:03 crc kubenswrapper[4782]: W0130 18:50:03.124761 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb19d14_c078_4fee_81ab_6246f1f56059.slice/crio-78354cbfeccc9b5b4c43cf77cf16ebca46fa797416af14add9382d477d674563 WatchSource:0}: Error finding container 78354cbfeccc9b5b4c43cf77cf16ebca46fa797416af14add9382d477d674563: Status 404 returned error can't find the container with id 78354cbfeccc9b5b4c43cf77cf16ebca46fa797416af14add9382d477d674563 Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.214282 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c143d7ec-7d96-4365-8e88-c2f41c55ebf7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.660308 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c143d7ec-7d96-4365-8e88-c2f41c55ebf7","Type":"ContainerDied","Data":"2b70e41b8a2308e8f7732cb323228614011db35236c64f6b0353bf1997b71928"} Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.661551 4782 scope.go:117] "RemoveContainer" containerID="782ccc4ae9c3383a09060a6d55285a72a4db075adc88a3e58d486e1f54464b9c" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.661487 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.664726 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-564c766f5d-2hhs6" event={"ID":"5162dd27-124a-4e1c-8a8c-51c4e47fce04","Type":"ContainerStarted","Data":"682dd014ab3a6f626bf9543629b529df5d609370a678c2cdd8c3ad2439f62b2e"} Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.665268 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.665292 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.669547 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3bb19d14-c078-4fee-81ab-6246f1f56059","Type":"ContainerStarted","Data":"78354cbfeccc9b5b4c43cf77cf16ebca46fa797416af14add9382d477d674563"} Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.715738 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.720526 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-564c766f5d-2hhs6" podStartSLOduration=3.720502332 podStartE2EDuration="3.720502332s" podCreationTimestamp="2026-01-30 18:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:03.698325784 +0000 UTC m=+1179.966703819" watchObservedRunningTime="2026-01-30 18:50:03.720502332 +0000 UTC m=+1179.988880357" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.734377 4782 scope.go:117] "RemoveContainer" 
containerID="728372e88f1f143b41acf904b9c9e1c86e75ca1dca5acbea24d0b4ec4389aa37" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.780996 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.797114 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.822177 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:50:03 crc kubenswrapper[4782]: E0130 18:50:03.822789 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="cinder-scheduler" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.822809 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="cinder-scheduler" Jan 30 18:50:03 crc kubenswrapper[4782]: E0130 18:50:03.822830 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="probe" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.822837 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="probe" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.823095 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="cinder-scheduler" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.823127 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" containerName="probe" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.824314 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.834110 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.835545 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.929952 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.930212 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.930302 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-scripts\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.930453 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-config-data\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.930611 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ac7726e-05ca-4e51-99e2-cce317290a59-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:03 crc kubenswrapper[4782]: I0130 18:50:03.930648 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdv5c\" (UniqueName: \"kubernetes.io/projected/5ac7726e-05ca-4e51-99e2-cce317290a59-kube-api-access-jdv5c\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.032728 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ac7726e-05ca-4e51-99e2-cce317290a59-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.032799 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdv5c\" (UniqueName: \"kubernetes.io/projected/5ac7726e-05ca-4e51-99e2-cce317290a59-kube-api-access-jdv5c\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.032870 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.032940 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.033011 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-scripts\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.033054 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-config-data\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.033218 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5ac7726e-05ca-4e51-99e2-cce317290a59-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.036739 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.036845 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.040530 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-scripts\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.051346 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac7726e-05ca-4e51-99e2-cce317290a59-config-data\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.067808 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdv5c\" (UniqueName: \"kubernetes.io/projected/5ac7726e-05ca-4e51-99e2-cce317290a59-kube-api-access-jdv5c\") pod \"cinder-scheduler-0\" (UID: \"5ac7726e-05ca-4e51-99e2-cce317290a59\") " 
pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.176466 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.435397 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" path="/var/lib/kubelet/pods/60cc626f-3c26-4417-87f0-5000cdbaadda/volumes" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.443474 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c7465b-12be-418c-b4c0-53ad1b9e07a5" path="/var/lib/kubelet/pods/86c7465b-12be-418c-b4c0-53ad1b9e07a5/volumes" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.449928 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c143d7ec-7d96-4365-8e88-c2f41c55ebf7" path="/var/lib/kubelet/pods/c143d7ec-7d96-4365-8e88-c2f41c55ebf7/volumes" Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.618115 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.721454 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5ac7726e-05ca-4e51-99e2-cce317290a59","Type":"ContainerStarted","Data":"d1cb0dc3ac77b6ca50f5bb6bc6329e9c0de3f055263a60025df1a63e8c695d05"} Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.725043 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f11ce800-5d86-4411-922e-af28bc822732","Type":"ContainerStarted","Data":"7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f"} Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.725119 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f11ce800-5d86-4411-922e-af28bc822732","Type":"ContainerStarted","Data":"faa0ae45ff10cff3b9064e290118521473924f5f36322f5355b224fd3af6afb1"} Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.739783 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3bb19d14-c078-4fee-81ab-6246f1f56059","Type":"ContainerStarted","Data":"6657418c9a7b7eccd97f606c95485ebde2d2ee5d06489b30bf910d1a6569a0a9"} Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.739850 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3bb19d14-c078-4fee-81ab-6246f1f56059","Type":"ContainerStarted","Data":"cc17318eb81e0fab13522608b3af76a8b5e1c1b435c697df5058572259139b1c"} Jan 30 18:50:04 crc kubenswrapper[4782]: I0130 18:50:04.769534 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.769513822 podStartE2EDuration="2.769513822s" podCreationTimestamp="2026-01-30 18:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:04.768047946 +0000 UTC m=+1181.036425981" watchObservedRunningTime="2026-01-30 18:50:04.769513822 +0000 UTC m=+1181.037891847" Jan 30 18:50:05 crc kubenswrapper[4782]: I0130 18:50:05.596728 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5c6f877b5f-8gdbg" Jan 30 18:50:05 crc kubenswrapper[4782]: I0130 18:50:05.608864 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/watcher-applier-0" Jan 30 18:50:05 crc kubenswrapper[4782]: I0130 18:50:05.657787 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Jan 30 18:50:05 crc kubenswrapper[4782]: I0130 18:50:05.750820 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5ac7726e-05ca-4e51-99e2-cce317290a59","Type":"ContainerStarted","Data":"b659cd52a76dd376d3e0157a6dd7e19e497a77986877571a2d7fa651c07ce527"} Jan 30 18:50:05 crc kubenswrapper[4782]: I0130 18:50:05.754877 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f11ce800-5d86-4411-922e-af28bc822732","Type":"ContainerStarted","Data":"ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc"} Jan 30 18:50:05 crc kubenswrapper[4782]: I0130 18:50:05.774014 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.77399745 podStartE2EDuration="3.77399745s" podCreationTimestamp="2026-01-30 18:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:05.773339704 +0000 UTC m=+1182.041717749" watchObservedRunningTime="2026-01-30 18:50:05.77399745 +0000 UTC m=+1182.042375465" Jan 30 18:50:05 crc kubenswrapper[4782]: I0130 18:50:05.797169 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Jan 30 18:50:06 crc kubenswrapper[4782]: I0130 18:50:06.779802 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5ac7726e-05ca-4e51-99e2-cce317290a59","Type":"ContainerStarted","Data":"d3e746029b0adbb71505acdd10234bdc56d1a832d31dcc1c59b3626c1786c62b"} Jan 30 18:50:06 crc kubenswrapper[4782]: I0130 18:50:06.805386 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.805366394 podStartE2EDuration="3.805366394s" podCreationTimestamp="2026-01-30 18:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:06.797767356 +0000 UTC m=+1183.066145381" watchObservedRunningTime="2026-01-30 18:50:06.805366394 +0000 UTC m=+1183.073744419" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.298743 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.300191 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.305272 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vgtgm" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.305465 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.308701 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.314410 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.402480 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5cc4f6997f-vmkbp" podUID="60cc626f-3c26-4417-87f0-5000cdbaadda" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: i/o timeout" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.408190 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322d742-28bf-4eb4-ba33-8e37da0780f1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.408334 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8322d742-28bf-4eb4-ba33-8e37da0780f1-openstack-config-secret\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.408366 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8322d742-28bf-4eb4-ba33-8e37da0780f1-openstack-config\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.408480 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xfcs\" (UniqueName: \"kubernetes.io/projected/8322d742-28bf-4eb4-ba33-8e37da0780f1-kube-api-access-5xfcs\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.511210 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xfcs\" (UniqueName: \"kubernetes.io/projected/8322d742-28bf-4eb4-ba33-8e37da0780f1-kube-api-access-5xfcs\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.511303 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322d742-28bf-4eb4-ba33-8e37da0780f1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.511474 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8322d742-28bf-4eb4-ba33-8e37da0780f1-openstack-config-secret\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.511507 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8322d742-28bf-4eb4-ba33-8e37da0780f1-openstack-config\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.513803 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8322d742-28bf-4eb4-ba33-8e37da0780f1-openstack-config\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.517299 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8322d742-28bf-4eb4-ba33-8e37da0780f1-openstack-config-secret\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.526389 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322d742-28bf-4eb4-ba33-8e37da0780f1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.540829 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xfcs\" (UniqueName: \"kubernetes.io/projected/8322d742-28bf-4eb4-ba33-8e37da0780f1-kube-api-access-5xfcs\") pod \"openstackclient\" (UID: \"8322d742-28bf-4eb4-ba33-8e37da0780f1\") " pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.620161 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.887629 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.887966 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:07 crc kubenswrapper[4782]: I0130 18:50:07.888810 4782 scope.go:117] "RemoveContainer" containerID="36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3" Jan 30 18:50:08 crc kubenswrapper[4782]: I0130 18:50:08.135878 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 18:50:08 crc kubenswrapper[4782]: W0130 18:50:08.137143 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8322d742_28bf_4eb4_ba33_8e37da0780f1.slice/crio-e7780b2dab7ffd580ec3dea8f86122d641095687447ba22e67dc70232114e21b WatchSource:0}: Error finding container e7780b2dab7ffd580ec3dea8f86122d641095687447ba22e67dc70232114e21b: Status 404 returned error can't find the container with id e7780b2dab7ffd580ec3dea8f86122d641095687447ba22e67dc70232114e21b Jan 30 18:50:08 crc kubenswrapper[4782]: I0130 18:50:08.813401 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8322d742-28bf-4eb4-ba33-8e37da0780f1","Type":"ContainerStarted","Data":"e7780b2dab7ffd580ec3dea8f86122d641095687447ba22e67dc70232114e21b"} Jan 30 18:50:08 crc kubenswrapper[4782]: I0130 18:50:08.816607 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerStarted","Data":"8c84961fe98ddcb4c3fa386a648bab582bdbd79b278cd6a12b723b29bedef783"} Jan 30 18:50:09 crc kubenswrapper[4782]: I0130 18:50:09.177031 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 18:50:10 crc kubenswrapper[4782]: I0130 18:50:10.084871 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 18:50:10 crc kubenswrapper[4782]: I0130 18:50:10.110048 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b5464cf9b-2tbsc" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.162:8443: connect: connection refused" Jan 30 18:50:10 crc kubenswrapper[4782]: I0130 18:50:10.110147 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:50:11 crc kubenswrapper[4782]: I0130 18:50:11.719805 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:50:11 crc kubenswrapper[4782]: I0130 18:50:11.720557 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6889757c94-v7jr9" Jan 30 18:50:11 crc kubenswrapper[4782]: I0130 18:50:11.892678 4782 generic.go:334] "Generic (PLEG): container finished" podID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerID="8c84961fe98ddcb4c3fa386a648bab582bdbd79b278cd6a12b723b29bedef783" exitCode=1 Jan 30 18:50:11 crc kubenswrapper[4782]: I0130 18:50:11.892986 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerDied","Data":"8c84961fe98ddcb4c3fa386a648bab582bdbd79b278cd6a12b723b29bedef783"} Jan 30 18:50:11 crc kubenswrapper[4782]: I0130 18:50:11.893031 4782 scope.go:117] "RemoveContainer" containerID="36901ad5157bdacabce631de9530a1c5518886322d1f560d0f5925ccf111f8c3" Jan 30 18:50:11 crc kubenswrapper[4782]: I0130 18:50:11.895532 4782 scope.go:117] "RemoveContainer" containerID="8c84961fe98ddcb4c3fa386a648bab582bdbd79b278cd6a12b723b29bedef783" Jan 30 18:50:11 crc kubenswrapper[4782]: E0130 18:50:11.896109 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(e82abe4a-d9ad-47dd-bd5c-2704052ba388)\"" pod="openstack/watcher-decision-engine-0" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.397758 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.397799 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.436606 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.451744 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.522389 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.573865 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-564c766f5d-2hhs6" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.627071 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cdd7f6f98-s4js9"] Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.627300 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cdd7f6f98-s4js9" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api-log" containerID="cri-o://64951b851656ef91c290bdff5f1f904ce39b1113ebe95dc7d8f6c075b5322eaa" gracePeriod=30 Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.627749 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cdd7f6f98-s4js9" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api" containerID="cri-o://5a0161df6f177b020d0463254982fbe7c36b18fb5c70d3da052e57c85fdae7ed" gracePeriod=30 Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.917459 4782 generic.go:334] "Generic (PLEG): container finished" podID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerID="64951b851656ef91c290bdff5f1f904ce39b1113ebe95dc7d8f6c075b5322eaa" exitCode=143 Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.917549 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cdd7f6f98-s4js9" event={"ID":"96b1ca41-cb51-47f5-a52e-fcfba8424503","Type":"ContainerDied","Data":"64951b851656ef91c290bdff5f1f904ce39b1113ebe95dc7d8f6c075b5322eaa"} 
Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.928161 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:12 crc kubenswrapper[4782]: I0130 18:50:12.928188 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:13 crc kubenswrapper[4782]: I0130 18:50:13.116206 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 18:50:13 crc kubenswrapper[4782]: I0130 18:50:13.116580 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 18:50:13 crc kubenswrapper[4782]: I0130 18:50:13.157572 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 18:50:13 crc kubenswrapper[4782]: I0130 18:50:13.158188 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 18:50:13 crc kubenswrapper[4782]: I0130 18:50:13.935385 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 18:50:13 crc kubenswrapper[4782]: I0130 18:50:13.935690 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 18:50:14 crc kubenswrapper[4782]: I0130 18:50:14.399449 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 18:50:14 crc kubenswrapper[4782]: I0130 18:50:14.946425 4782 generic.go:334] "Generic (PLEG): container finished" podID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerID="441cb98b7bdcb9129ff7e175460fd76e74ce990bde17ddae86c8e1c5a95f2c84" exitCode=137 Jan 30 18:50:14 crc kubenswrapper[4782]: I0130 18:50:14.946782 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5464cf9b-2tbsc" event={"ID":"b2e303df-cb69-4f2f-909f-a1651d376adc","Type":"ContainerDied","Data":"441cb98b7bdcb9129ff7e175460fd76e74ce990bde17ddae86c8e1c5a95f2c84"} Jan 30 18:50:14 crc kubenswrapper[4782]: I0130 18:50:14.949418 4782 generic.go:334] "Generic (PLEG): container finished" podID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerID="5a0161df6f177b020d0463254982fbe7c36b18fb5c70d3da052e57c85fdae7ed" exitCode=0 Jan 30 18:50:14 crc kubenswrapper[4782]: I0130 18:50:14.949498 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:50:14 crc kubenswrapper[4782]: I0130 18:50:14.949507 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:50:14 crc kubenswrapper[4782]: I0130 18:50:14.950634 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cdd7f6f98-s4js9" event={"ID":"96b1ca41-cb51-47f5-a52e-fcfba8424503","Type":"ContainerDied","Data":"5a0161df6f177b020d0463254982fbe7c36b18fb5c70d3da052e57c85fdae7ed"} Jan 30 18:50:15 crc kubenswrapper[4782]: I0130 18:50:15.833915 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6989f95847-z8k6r"] Jan 30 18:50:15 crc kubenswrapper[4782]: I0130 18:50:15.835726 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:15 crc kubenswrapper[4782]: I0130 18:50:15.839562 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 30 18:50:15 crc kubenswrapper[4782]: I0130 18:50:15.839706 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 18:50:15 crc kubenswrapper[4782]: I0130 18:50:15.839777 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 30 18:50:15 crc kubenswrapper[4782]: I0130 18:50:15.850528 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6989f95847-z8k6r"] Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006007 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-config-data\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006070 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-internal-tls-certs\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006104 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d86a7921-fdce-4a73-ad98-4dc1373c72e2-run-httpd\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006130 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d86a7921-fdce-4a73-ad98-4dc1373c72e2-log-httpd\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006156 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-combined-ca-bundle\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006176 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-public-tls-certs\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006197 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d86a7921-fdce-4a73-ad98-4dc1373c72e2-etc-swift\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " 
pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.006250 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpf9\" (UniqueName: \"kubernetes.io/projected/d86a7921-fdce-4a73-ad98-4dc1373c72e2-kube-api-access-9tpf9\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108340 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpf9\" (UniqueName: \"kubernetes.io/projected/d86a7921-fdce-4a73-ad98-4dc1373c72e2-kube-api-access-9tpf9\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108478 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-config-data\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108538 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-internal-tls-certs\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108582 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d86a7921-fdce-4a73-ad98-4dc1373c72e2-run-httpd\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108617 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d86a7921-fdce-4a73-ad98-4dc1373c72e2-log-httpd\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108653 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-combined-ca-bundle\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108685 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-public-tls-certs\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.108714 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d86a7921-fdce-4a73-ad98-4dc1373c72e2-etc-swift\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " 
pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.109213 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d86a7921-fdce-4a73-ad98-4dc1373c72e2-log-httpd\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.109436 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d86a7921-fdce-4a73-ad98-4dc1373c72e2-run-httpd\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.114399 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cdd7f6f98-s4js9" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": dial tcp 10.217.0.173:9311: connect: connection refused" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.114446 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cdd7f6f98-s4js9" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.173:9311/healthcheck\": dial tcp 10.217.0.173:9311: connect: connection refused" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.115896 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-config-data\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.117158 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-internal-tls-certs\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.117402 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-public-tls-certs\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.118113 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d86a7921-fdce-4a73-ad98-4dc1373c72e2-combined-ca-bundle\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.118116 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d86a7921-fdce-4a73-ad98-4dc1373c72e2-etc-swift\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.127492 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9tpf9\" (UniqueName: \"kubernetes.io/projected/d86a7921-fdce-4a73-ad98-4dc1373c72e2-kube-api-access-9tpf9\") pod \"swift-proxy-6989f95847-z8k6r\" (UID: \"d86a7921-fdce-4a73-ad98-4dc1373c72e2\") " pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.168554 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:16 crc kubenswrapper[4782]: I0130 18:50:16.958770 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.389060 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.389183 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.389494 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.389596 4782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.396688 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.527293 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.887507 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.887810 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:17 crc kubenswrapper[4782]: I0130 18:50:17.888737 4782 scope.go:117] "RemoveContainer" containerID="8c84961fe98ddcb4c3fa386a648bab582bdbd79b278cd6a12b723b29bedef783" Jan 30 18:50:17 crc kubenswrapper[4782]: E0130 18:50:17.889019 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(e82abe4a-d9ad-47dd-bd5c-2704052ba388)\"" pod="openstack/watcher-decision-engine-0" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.427112 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.427388 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-central-agent" containerID="cri-o://47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6" gracePeriod=30 Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.427509 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="proxy-httpd" containerID="cri-o://5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89" gracePeriod=30 Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.427543 
4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="sg-core" containerID="cri-o://288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a" gracePeriod=30 Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.427571 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-notification-agent" containerID="cri-o://325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f" gracePeriod=30 Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.447521 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.993967 4782 generic.go:334] "Generic (PLEG): container finished" podID="b357c566-0063-4e60-b284-9d4e8911734d" containerID="5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89" exitCode=0 Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.994169 4782 generic.go:334] "Generic (PLEG): container finished" podID="b357c566-0063-4e60-b284-9d4e8911734d" containerID="288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a" exitCode=2 Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.994180 4782 generic.go:334] "Generic (PLEG): container finished" podID="b357c566-0063-4e60-b284-9d4e8911734d" containerID="47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6" exitCode=0 Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.994173 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerDied","Data":"5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89"} Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.994223 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerDied","Data":"288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a"} Jan 30 18:50:18 crc kubenswrapper[4782]: I0130 18:50:18.994250 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerDied","Data":"47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6"} Jan 30 18:50:19 crc kubenswrapper[4782]: I0130 18:50:19.792597 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:50:19 crc kubenswrapper[4782]: I0130 18:50:19.792661 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:50:20 crc kubenswrapper[4782]: I0130 18:50:20.110307 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b5464cf9b-2tbsc" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.162:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 
10.217.0.162:8443: connect: connection refused" Jan 30 18:50:20 crc kubenswrapper[4782]: I0130 18:50:20.899221 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.011460 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.024993 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8322d742-28bf-4eb4-ba33-8e37da0780f1","Type":"ContainerStarted","Data":"2fb5d9292f0047b38860deef89ce65390ebc3b6ed9b5875ad2c29ec8d0020129"} Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.027645 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cdd7f6f98-s4js9" event={"ID":"96b1ca41-cb51-47f5-a52e-fcfba8424503","Type":"ContainerDied","Data":"7e7ccfce383e337bbdf37f09639d15a161917f5de0df8d5b39ecd25b84950d11"} Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.027704 4782 scope.go:117] "RemoveContainer" containerID="5a0161df6f177b020d0463254982fbe7c36b18fb5c70d3da052e57c85fdae7ed" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.027797 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cdd7f6f98-s4js9" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.033383 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b5464cf9b-2tbsc" event={"ID":"b2e303df-cb69-4f2f-909f-a1651d376adc","Type":"ContainerDied","Data":"7089e04dabf15ad197d928d57d70d81c4b4244b3196f530352d3562c2a99555e"} Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.033628 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b5464cf9b-2tbsc" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.035823 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-tls-certs\") pod \"b2e303df-cb69-4f2f-909f-a1651d376adc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.035884 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e303df-cb69-4f2f-909f-a1651d376adc-logs\") pod \"b2e303df-cb69-4f2f-909f-a1651d376adc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.035983 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-scripts\") pod \"b2e303df-cb69-4f2f-909f-a1651d376adc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.036039 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp5sf\" (UniqueName: \"kubernetes.io/projected/b2e303df-cb69-4f2f-909f-a1651d376adc-kube-api-access-fp5sf\") pod \"b2e303df-cb69-4f2f-909f-a1651d376adc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.036088 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-combined-ca-bundle\") pod \"b2e303df-cb69-4f2f-909f-a1651d376adc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.036116 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-config-data\") pod \"b2e303df-cb69-4f2f-909f-a1651d376adc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.036152 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-secret-key\") pod \"b2e303df-cb69-4f2f-909f-a1651d376adc\" (UID: \"b2e303df-cb69-4f2f-909f-a1651d376adc\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.041501 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e303df-cb69-4f2f-909f-a1651d376adc-logs" (OuterVolumeSpecName: "logs") pod "b2e303df-cb69-4f2f-909f-a1651d376adc" (UID: "b2e303df-cb69-4f2f-909f-a1651d376adc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.053667 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b2e303df-cb69-4f2f-909f-a1651d376adc" (UID: "b2e303df-cb69-4f2f-909f-a1651d376adc"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.058953 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e303df-cb69-4f2f-909f-a1651d376adc-kube-api-access-fp5sf" (OuterVolumeSpecName: "kube-api-access-fp5sf") pod "b2e303df-cb69-4f2f-909f-a1651d376adc" (UID: "b2e303df-cb69-4f2f-909f-a1651d376adc"). InnerVolumeSpecName "kube-api-access-fp5sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.066679 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.556329291 podStartE2EDuration="14.066660233s" podCreationTimestamp="2026-01-30 18:50:07 +0000 UTC" firstStartedPulling="2026-01-30 18:50:08.13935244 +0000 UTC m=+1184.407730455" lastFinishedPulling="2026-01-30 18:50:20.649683372 +0000 UTC m=+1196.918061397" observedRunningTime="2026-01-30 18:50:21.058315006 +0000 UTC m=+1197.326693031" watchObservedRunningTime="2026-01-30 18:50:21.066660233 +0000 UTC m=+1197.335038258" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.068650 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-config-data" (OuterVolumeSpecName: "config-data") pod "b2e303df-cb69-4f2f-909f-a1651d376adc" (UID: "b2e303df-cb69-4f2f-909f-a1651d376adc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.097498 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2e303df-cb69-4f2f-909f-a1651d376adc" (UID: "b2e303df-cb69-4f2f-909f-a1651d376adc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.101359 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-scripts" (OuterVolumeSpecName: "scripts") pod "b2e303df-cb69-4f2f-909f-a1651d376adc" (UID: "b2e303df-cb69-4f2f-909f-a1651d376adc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.105304 4782 scope.go:117] "RemoveContainer" containerID="64951b851656ef91c290bdff5f1f904ce39b1113ebe95dc7d8f6c075b5322eaa" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.116364 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b2e303df-cb69-4f2f-909f-a1651d376adc" (UID: "b2e303df-cb69-4f2f-909f-a1651d376adc"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.137541 4782 scope.go:117] "RemoveContainer" containerID="721c73bb806b44f16ca65b0e3d3bdf2b72298ad7af4c050b8fb757c30b1fb414" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.140802 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6272\" (UniqueName: \"kubernetes.io/projected/96b1ca41-cb51-47f5-a52e-fcfba8424503-kube-api-access-v6272\") pod \"96b1ca41-cb51-47f5-a52e-fcfba8424503\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.141064 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b1ca41-cb51-47f5-a52e-fcfba8424503-logs\") pod \"96b1ca41-cb51-47f5-a52e-fcfba8424503\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.141164 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data\") pod \"96b1ca41-cb51-47f5-a52e-fcfba8424503\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.141277 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data-custom\") pod \"96b1ca41-cb51-47f5-a52e-fcfba8424503\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.141315 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-combined-ca-bundle\") pod \"96b1ca41-cb51-47f5-a52e-fcfba8424503\" (UID: \"96b1ca41-cb51-47f5-a52e-fcfba8424503\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.141776 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96b1ca41-cb51-47f5-a52e-fcfba8424503-logs" (OuterVolumeSpecName: "logs") pod "96b1ca41-cb51-47f5-a52e-fcfba8424503" (UID: "96b1ca41-cb51-47f5-a52e-fcfba8424503"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.141964 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.141987 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp5sf\" (UniqueName: \"kubernetes.io/projected/b2e303df-cb69-4f2f-909f-a1651d376adc-kube-api-access-fp5sf\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.142002 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.142014 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b2e303df-cb69-4f2f-909f-a1651d376adc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.142026 4782 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.142038 4782 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e303df-cb69-4f2f-909f-a1651d376adc-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.142050 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2e303df-cb69-4f2f-909f-a1651d376adc-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.142060 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96b1ca41-cb51-47f5-a52e-fcfba8424503-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.146792 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b1ca41-cb51-47f5-a52e-fcfba8424503-kube-api-access-v6272" (OuterVolumeSpecName: "kube-api-access-v6272") pod "96b1ca41-cb51-47f5-a52e-fcfba8424503" (UID: "96b1ca41-cb51-47f5-a52e-fcfba8424503"). InnerVolumeSpecName "kube-api-access-v6272". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.149678 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "96b1ca41-cb51-47f5-a52e-fcfba8424503" (UID: "96b1ca41-cb51-47f5-a52e-fcfba8424503"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.171113 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96b1ca41-cb51-47f5-a52e-fcfba8424503" (UID: "96b1ca41-cb51-47f5-a52e-fcfba8424503"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.193039 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data" (OuterVolumeSpecName: "config-data") pod "96b1ca41-cb51-47f5-a52e-fcfba8424503" (UID: "96b1ca41-cb51-47f5-a52e-fcfba8424503"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.245785 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.245818 4782 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.245863 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96b1ca41-cb51-47f5-a52e-fcfba8424503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.245878 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6272\" (UniqueName: \"kubernetes.io/projected/96b1ca41-cb51-47f5-a52e-fcfba8424503-kube-api-access-v6272\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.295300 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6989f95847-z8k6r"] Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.374429 4782 scope.go:117] "RemoveContainer" containerID="441cb98b7bdcb9129ff7e175460fd76e74ce990bde17ddae86c8e1c5a95f2c84" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.391639 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cdd7f6f98-s4js9"] Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.402747 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-cdd7f6f98-s4js9"] Jan 30 18:50:21 crc kubenswrapper[4782]: W0130 18:50:21.409465 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd86a7921_fdce_4a73_ad98_4dc1373c72e2.slice/crio-e4c1e91d9aa761f834f2176aced2b3b61f9f0a59fe8c80d7fc932f60189531d0 WatchSource:0}: Error finding container e4c1e91d9aa761f834f2176aced2b3b61f9f0a59fe8c80d7fc932f60189531d0: Status 404 returned error can't find the container with id e4c1e91d9aa761f834f2176aced2b3b61f9f0a59fe8c80d7fc932f60189531d0 Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.424205 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b5464cf9b-2tbsc"] Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.430904 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.434197 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b5464cf9b-2tbsc"] Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.554971 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-scripts\") pod \"b357c566-0063-4e60-b284-9d4e8911734d\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555266 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-log-httpd\") pod \"b357c566-0063-4e60-b284-9d4e8911734d\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555317 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-combined-ca-bundle\") pod \"b357c566-0063-4e60-b284-9d4e8911734d\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555421 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xwm5\" (UniqueName: \"kubernetes.io/projected/b357c566-0063-4e60-b284-9d4e8911734d-kube-api-access-9xwm5\") pod \"b357c566-0063-4e60-b284-9d4e8911734d\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555441 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-sg-core-conf-yaml\") pod \"b357c566-0063-4e60-b284-9d4e8911734d\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555504 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-run-httpd\") pod \"b357c566-0063-4e60-b284-9d4e8911734d\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555545 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-config-data\") pod \"b357c566-0063-4e60-b284-9d4e8911734d\" (UID: \"b357c566-0063-4e60-b284-9d4e8911734d\") " Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555629 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b357c566-0063-4e60-b284-9d4e8911734d" (UID: "b357c566-0063-4e60-b284-9d4e8911734d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.555903 4782 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.557433 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b357c566-0063-4e60-b284-9d4e8911734d" (UID: "b357c566-0063-4e60-b284-9d4e8911734d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.558119 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-scripts" (OuterVolumeSpecName: "scripts") pod "b357c566-0063-4e60-b284-9d4e8911734d" (UID: "b357c566-0063-4e60-b284-9d4e8911734d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.559808 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357c566-0063-4e60-b284-9d4e8911734d-kube-api-access-9xwm5" (OuterVolumeSpecName: "kube-api-access-9xwm5") pod "b357c566-0063-4e60-b284-9d4e8911734d" (UID: "b357c566-0063-4e60-b284-9d4e8911734d"). InnerVolumeSpecName "kube-api-access-9xwm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.591869 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b357c566-0063-4e60-b284-9d4e8911734d" (UID: "b357c566-0063-4e60-b284-9d4e8911734d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.626349 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b357c566-0063-4e60-b284-9d4e8911734d" (UID: "b357c566-0063-4e60-b284-9d4e8911734d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.658118 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.658152 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.658161 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xwm5\" (UniqueName: \"kubernetes.io/projected/b357c566-0063-4e60-b284-9d4e8911734d-kube-api-access-9xwm5\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.658170 4782 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.658178 4782 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b357c566-0063-4e60-b284-9d4e8911734d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.665301 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-config-data" (OuterVolumeSpecName: "config-data") pod "b357c566-0063-4e60-b284-9d4e8911734d" (UID: "b357c566-0063-4e60-b284-9d4e8911734d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.760273 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b357c566-0063-4e60-b284-9d4e8911734d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.916747 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mths6"] Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917109 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api-log" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917122 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api-log" Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917141 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="sg-core" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917147 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="sg-core" Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917154 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon-log" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917161 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon-log" Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917185 4782 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917191 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api" Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917201 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-central-agent" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917207 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-central-agent" Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917216 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="proxy-httpd" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917221 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="proxy-httpd" Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917250 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-notification-agent" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917256 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-notification-agent" Jan 30 18:50:21 crc kubenswrapper[4782]: E0130 18:50:21.917267 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917272 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917441 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-central-agent" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917456 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="proxy-httpd" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917466 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="sg-core" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917478 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api-log" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917492 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917503 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" containerName="barbican-api" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917514 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" containerName="horizon-log" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.917521 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="ceilometer-notification-agent" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 
18:50:21.918188 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.943455 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mths6"] Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.964387 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gf4\" (UniqueName: \"kubernetes.io/projected/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-kube-api-access-m2gf4\") pod \"nova-api-db-create-mths6\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:21 crc kubenswrapper[4782]: I0130 18:50:21.964640 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-operator-scripts\") pod \"nova-api-db-create-mths6\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.001730 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-csd5t"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.002973 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.009455 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2f36-account-create-update-5x2wl"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.010729 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.013008 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.032005 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-csd5t"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.057553 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2f36-account-create-update-5x2wl"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.065452 4782 generic.go:334] "Generic (PLEG): container finished" podID="b357c566-0063-4e60-b284-9d4e8911734d" containerID="325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f" exitCode=0 Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.065497 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerDied","Data":"325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f"} Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.065517 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b357c566-0063-4e60-b284-9d4e8911734d","Type":"ContainerDied","Data":"728d420c86f814ef192851b23b6445aaa96498e9a053f33a700f6554f613f4d4"} Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.065534 4782 scope.go:117] "RemoveContainer" containerID="5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.065631 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.066000 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04bb697c-568d-47bc-abb6-56dc09be923d-operator-scripts\") pod \"nova-cell0-db-create-csd5t\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.066062 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwb98\" (UniqueName: \"kubernetes.io/projected/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-kube-api-access-qwb98\") pod \"nova-api-2f36-account-create-update-5x2wl\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.066204 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp8kh\" (UniqueName: \"kubernetes.io/projected/04bb697c-568d-47bc-abb6-56dc09be923d-kube-api-access-vp8kh\") pod \"nova-cell0-db-create-csd5t\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.066279 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-operator-scripts\") pod \"nova-api-2f36-account-create-update-5x2wl\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.066323 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gf4\" (UniqueName: \"kubernetes.io/projected/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-kube-api-access-m2gf4\") pod \"nova-api-db-create-mths6\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.066508 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-operator-scripts\") pod \"nova-api-db-create-mths6\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.068426 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-operator-scripts\") pod \"nova-api-db-create-mths6\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.072606 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6989f95847-z8k6r" event={"ID":"d86a7921-fdce-4a73-ad98-4dc1373c72e2","Type":"ContainerStarted","Data":"4f769d092b8a865331d5489d3d484ba7846970f5b6d5b6aa82df979bef16a953"} Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.072667 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6989f95847-z8k6r" event={"ID":"d86a7921-fdce-4a73-ad98-4dc1373c72e2","Type":"ContainerStarted","Data":"e4c1e91d9aa761f834f2176aced2b3b61f9f0a59fe8c80d7fc932f60189531d0"} 
Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.102299 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gf4\" (UniqueName: \"kubernetes.io/projected/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-kube-api-access-m2gf4\") pod \"nova-api-db-create-mths6\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.108390 4782 scope.go:117] "RemoveContainer" containerID="288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.120389 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.136843 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.151754 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.153935 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.156631 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.168548 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp8kh\" (UniqueName: \"kubernetes.io/projected/04bb697c-568d-47bc-abb6-56dc09be923d-kube-api-access-vp8kh\") pod \"nova-cell0-db-create-csd5t\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.168792 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.168882 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-operator-scripts\") pod \"nova-api-2f36-account-create-update-5x2wl\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.169215 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04bb697c-568d-47bc-abb6-56dc09be923d-operator-scripts\") pod \"nova-cell0-db-create-csd5t\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.169352 4782 scope.go:117] "RemoveContainer" containerID="325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.169445 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwb98\" (UniqueName: \"kubernetes.io/projected/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-kube-api-access-qwb98\") pod \"nova-api-2f36-account-create-update-5x2wl\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.170439 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-operator-scripts\") pod \"nova-api-2f36-account-create-update-5x2wl\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.170902 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04bb697c-568d-47bc-abb6-56dc09be923d-operator-scripts\") pod \"nova-cell0-db-create-csd5t\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.175393 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.202070 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp8kh\" (UniqueName: \"kubernetes.io/projected/04bb697c-568d-47bc-abb6-56dc09be923d-kube-api-access-vp8kh\") pod \"nova-cell0-db-create-csd5t\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.217755 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwb98\" (UniqueName: \"kubernetes.io/projected/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-kube-api-access-qwb98\") pod \"nova-api-2f36-account-create-update-5x2wl\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.235509 4782 scope.go:117] "RemoveContainer" containerID="47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.235863 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.237405 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4k56s"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.239240 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.247848 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4ad3-account-create-update-t6n2b"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.252875 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.255580 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.258891 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4k56s"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.290035 4782 scope.go:117] "RemoveContainer" containerID="5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.290491 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.290573 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-log-httpd\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.290648 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-config-data\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.290720 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.290955 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-scripts\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.290973 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cktpx\" (UniqueName: \"kubernetes.io/projected/d2418592-40ea-4fe5-91f1-134a6635f724-kube-api-access-cktpx\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.291114 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-run-httpd\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.294517 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4ad3-account-create-update-t6n2b"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.318061 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:22 crc kubenswrapper[4782]: E0130 18:50:22.318624 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89\": container with ID starting with 5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89 not found: ID does not exist" containerID="5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.321887 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89"} err="failed to get container status \"5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89\": rpc error: code = NotFound desc = could not find container \"5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89\": container with ID starting with 5a908ccc18e5b28b83fbb7254d25f115d4b94e467b9d34faa50ef1fa328d6d89 not found: ID does not exist" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.322366 4782 scope.go:117] "RemoveContainer" containerID="288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a" Jan 30 18:50:22 crc kubenswrapper[4782]: E0130 18:50:22.326221 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a\": container with ID starting with 288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a not found: ID does not exist" containerID="288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.326385 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a"} err="failed to get container status \"288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a\": rpc error: code = NotFound desc = could not find container \"288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a\": container with ID starting with 288a0766f0615d3e226499053c8b911bfa2902b99d21e3850d4aed93a766011a not found: ID does not exist" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.326477 4782 scope.go:117] "RemoveContainer" containerID="325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f" Jan 30 18:50:22 crc kubenswrapper[4782]: E0130 18:50:22.326772 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f\": container with ID starting with 325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f not found: ID does not exist" containerID="325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.326865 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f"} err="failed to get container status \"325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f\": rpc error: code = NotFound desc = could not find container \"325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f\": container with ID starting with 
325fe570b72e58636313cdef674ab7cc4101afa79d4a548aabd8eb1b5616fe2f not found: ID does not exist" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.326930 4782 scope.go:117] "RemoveContainer" containerID="47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6" Jan 30 18:50:22 crc kubenswrapper[4782]: E0130 18:50:22.327220 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6\": container with ID starting with 47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6 not found: ID does not exist" containerID="47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.327352 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6"} err="failed to get container status \"47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6\": rpc error: code = NotFound desc = could not find container \"47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6\": container with ID starting with 47eee7b5bf287ad825b139148d5f73d2bcb3619779fd3cbe93b6d9a8e68ff0b6 not found: ID does not exist" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.351503 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.392690 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-scripts\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395160 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cktpx\" (UniqueName: \"kubernetes.io/projected/d2418592-40ea-4fe5-91f1-134a6635f724-kube-api-access-cktpx\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395201 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d517ad-1f63-4dc7-9893-b686884dc3d8-operator-scripts\") pod \"nova-cell1-db-create-4k56s\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395266 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rknrj\" (UniqueName: \"kubernetes.io/projected/24d517ad-1f63-4dc7-9893-b686884dc3d8-kube-api-access-rknrj\") pod \"nova-cell1-db-create-4k56s\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395319 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-run-httpd\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395385 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km49j\" (UniqueName: \"kubernetes.io/projected/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-kube-api-access-km49j\") pod \"nova-cell0-4ad3-account-create-update-t6n2b\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395424 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395456 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-log-httpd\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395487 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-config-data\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395507 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.395538 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-operator-scripts\") pod \"nova-cell0-4ad3-account-create-update-t6n2b\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.398130 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-log-httpd\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.399582 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-run-httpd\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.416214 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-config-data\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.416403 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.423364 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.426732 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cktpx\" (UniqueName: \"kubernetes.io/projected/d2418592-40ea-4fe5-91f1-134a6635f724-kube-api-access-cktpx\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.427901 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-scripts\") pod \"ceilometer-0\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.451163 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b1ca41-cb51-47f5-a52e-fcfba8424503" path="/var/lib/kubelet/pods/96b1ca41-cb51-47f5-a52e-fcfba8424503/volumes" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.452258 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e303df-cb69-4f2f-909f-a1651d376adc" path="/var/lib/kubelet/pods/b2e303df-cb69-4f2f-909f-a1651d376adc/volumes" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.453018 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b357c566-0063-4e60-b284-9d4e8911734d" path="/var/lib/kubelet/pods/b357c566-0063-4e60-b284-9d4e8911734d/volumes" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.454567 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4a3d-account-create-update-h2xvb"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.456180 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4a3d-account-create-update-h2xvb"] Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.456621 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.459848 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.498698 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km49j\" (UniqueName: \"kubernetes.io/projected/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-kube-api-access-km49j\") pod \"nova-cell0-4ad3-account-create-update-t6n2b\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.499057 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-operator-scripts\") pod \"nova-cell0-4ad3-account-create-update-t6n2b\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.499220 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d517ad-1f63-4dc7-9893-b686884dc3d8-operator-scripts\") pod \"nova-cell1-db-create-4k56s\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.499360 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rknrj\" (UniqueName: \"kubernetes.io/projected/24d517ad-1f63-4dc7-9893-b686884dc3d8-kube-api-access-rknrj\") pod \"nova-cell1-db-create-4k56s\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.501206 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-operator-scripts\") pod \"nova-cell0-4ad3-account-create-update-t6n2b\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.501823 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d517ad-1f63-4dc7-9893-b686884dc3d8-operator-scripts\") pod \"nova-cell1-db-create-4k56s\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.506368 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.525204 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rknrj\" (UniqueName: \"kubernetes.io/projected/24d517ad-1f63-4dc7-9893-b686884dc3d8-kube-api-access-rknrj\") pod \"nova-cell1-db-create-4k56s\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.533302 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km49j\" (UniqueName: \"kubernetes.io/projected/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-kube-api-access-km49j\") pod \"nova-cell0-4ad3-account-create-update-t6n2b\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.556796 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.574739 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.601674 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwrv\" (UniqueName: \"kubernetes.io/projected/d0185b2b-ed05-407b-93ee-3f5e83ee630a-kube-api-access-8mwrv\") pod \"nova-cell1-4a3d-account-create-update-h2xvb\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.601726 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0185b2b-ed05-407b-93ee-3f5e83ee630a-operator-scripts\") pod \"nova-cell1-4a3d-account-create-update-h2xvb\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.705565 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwrv\" (UniqueName: \"kubernetes.io/projected/d0185b2b-ed05-407b-93ee-3f5e83ee630a-kube-api-access-8mwrv\") pod \"nova-cell1-4a3d-account-create-update-h2xvb\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.705640 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0185b2b-ed05-407b-93ee-3f5e83ee630a-operator-scripts\") pod \"nova-cell1-4a3d-account-create-update-h2xvb\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.707792 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0185b2b-ed05-407b-93ee-3f5e83ee630a-operator-scripts\") pod \"nova-cell1-4a3d-account-create-update-h2xvb\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:22 crc kubenswrapper[4782]: I0130 18:50:22.755824 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8mwrv\" (UniqueName: \"kubernetes.io/projected/d0185b2b-ed05-407b-93ee-3f5e83ee630a-kube-api-access-8mwrv\") pod \"nova-cell1-4a3d-account-create-update-h2xvb\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:22.884366 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.084476 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6989f95847-z8k6r" event={"ID":"d86a7921-fdce-4a73-ad98-4dc1373c72e2","Type":"ContainerStarted","Data":"650b87f245baa929df4a5eecf090f20a7299daca38fd3e62705b20f6a42d6fdb"} Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.084602 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.103757 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6989f95847-z8k6r" podStartSLOduration=8.103737685 podStartE2EDuration="8.103737685s" podCreationTimestamp="2026-01-30 18:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:23.102195777 +0000 UTC m=+1199.370573792" watchObservedRunningTime="2026-01-30 18:50:23.103737685 +0000 UTC m=+1199.372115700" Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.143743 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mths6"] Jan 30 18:50:23 crc kubenswrapper[4782]: W0130 18:50:23.144164 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d5e6d06_a8c5_4789_8e21_6aba18cb8088.slice/crio-acbe585ab7603878fdf93ac53e287934a3fb8c7f2af7f5166e15c3f3ff498db9 WatchSource:0}: Error finding container acbe585ab7603878fdf93ac53e287934a3fb8c7f2af7f5166e15c3f3ff498db9: Status 404 returned error can't find the container with id acbe585ab7603878fdf93ac53e287934a3fb8c7f2af7f5166e15c3f3ff498db9 Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.300508 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-csd5t"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.324717 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2f36-account-create-update-5x2wl"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.538905 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4ad3-account-create-update-t6n2b"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.563030 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4k56s"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.642971 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.657936 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.693677 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4a3d-account-create-update-h2xvb"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.961847 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.985797 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:50:23 crc kubenswrapper[4782]: I0130 18:50:23.986060 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a89d76b9-7010-4d8b-ac8e-fac56394928d" containerName="kube-state-metrics" containerID="cri-o://d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7" gracePeriod=30 Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.103659 4782 generic.go:334] "Generic (PLEG): container finished" podID="04bb697c-568d-47bc-abb6-56dc09be923d" containerID="6b13dfb69af04f63ac75806c7c8b1e99a29a7bc41f24744cc29ba67a5daa914a" exitCode=0 Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.103734 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-csd5t" event={"ID":"04bb697c-568d-47bc-abb6-56dc09be923d","Type":"ContainerDied","Data":"6b13dfb69af04f63ac75806c7c8b1e99a29a7bc41f24744cc29ba67a5daa914a"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.103761 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-csd5t" event={"ID":"04bb697c-568d-47bc-abb6-56dc09be923d","Type":"ContainerStarted","Data":"34f553faf3588a0fcca4d7e8797cce4857be78fba14dacf80c89e46aff37eed9"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.107016 4782 generic.go:334] "Generic (PLEG): container finished" podID="233df4b9-7dfd-4817-b0fc-51db7b8d77d1" containerID="19a09d398e1dd3e57b06678a11dd97bbfe9cdf78a150eea9ea7b8d0b3a907745" exitCode=0 Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.107116 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f36-account-create-update-5x2wl" event={"ID":"233df4b9-7dfd-4817-b0fc-51db7b8d77d1","Type":"ContainerDied","Data":"19a09d398e1dd3e57b06678a11dd97bbfe9cdf78a150eea9ea7b8d0b3a907745"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.107145 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f36-account-create-update-5x2wl" event={"ID":"233df4b9-7dfd-4817-b0fc-51db7b8d77d1","Type":"ContainerStarted","Data":"bdf29516469368bf8dc785772f563d3fb82cc49a730858ee7ef3f9fbb5f8df19"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.116801 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4k56s" event={"ID":"24d517ad-1f63-4dc7-9893-b686884dc3d8","Type":"ContainerStarted","Data":"80d8d94362d353c3ee5e16fb0223a49de80f56dcdac5824fda2ec39c90cb2bd3"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.116841 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4k56s" event={"ID":"24d517ad-1f63-4dc7-9893-b686884dc3d8","Type":"ContainerStarted","Data":"30c74568813f04d1a5e7a4d7ee9a8b2d68bd04aac9bd61502398b7a4c2dd6fce"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.118320 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerStarted","Data":"8f47df6de6800f44213037898e2ff18f3115b7fd8eb63241bba0f07ffec58eca"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.118342 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerStarted","Data":"7d91dde19e5f39479215600e0ee55978d306c09ab5d4d0cc63cf457f20b4d606"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.127479 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" event={"ID":"6bdb65b3-6289-42fb-8b67-290b3b72cb4f","Type":"ContainerStarted","Data":"b6d3219b7f40a2655f11e2a18a6aedf81bf3a961c4d7ecf9036790423358269e"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.127517 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" event={"ID":"6bdb65b3-6289-42fb-8b67-290b3b72cb4f","Type":"ContainerStarted","Data":"306358912bcb83a5a7a8a187a5a790a82c58d1c793500419909e356149dd946c"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.140444 4782 generic.go:334] "Generic (PLEG): container finished" podID="0d5e6d06-a8c5-4789-8e21-6aba18cb8088" containerID="8c2546defb9d2bc5948dce22e653318e7e09cdfcfdad5b9843d942651fe66f78" exitCode=0 Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.140513 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mths6" event={"ID":"0d5e6d06-a8c5-4789-8e21-6aba18cb8088","Type":"ContainerDied","Data":"8c2546defb9d2bc5948dce22e653318e7e09cdfcfdad5b9843d942651fe66f78"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.140537 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mths6" event={"ID":"0d5e6d06-a8c5-4789-8e21-6aba18cb8088","Type":"ContainerStarted","Data":"acbe585ab7603878fdf93ac53e287934a3fb8c7f2af7f5166e15c3f3ff498db9"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.180829 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" event={"ID":"d0185b2b-ed05-407b-93ee-3f5e83ee630a","Type":"ContainerStarted","Data":"29010e56782893fa617a8e2c0cd0c5a188f249db8d97bd95cd27e576509b2873"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.181198 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.181218 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" event={"ID":"d0185b2b-ed05-407b-93ee-3f5e83ee630a","Type":"ContainerStarted","Data":"a6419dc718a78f652060122a4ad24f8da0ceb3bde5491efcca68fd284b31fd2f"} Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.187096 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-4k56s" podStartSLOduration=2.187074463 podStartE2EDuration="2.187074463s" podCreationTimestamp="2026-01-30 18:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:24.179577088 +0000 UTC m=+1200.447955103" watchObservedRunningTime="2026-01-30 18:50:24.187074463 +0000 UTC m=+1200.455452488" Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.317082 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" podStartSLOduration=2.317058737 podStartE2EDuration="2.317058737s" podCreationTimestamp="2026-01-30 18:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 
18:50:24.220639293 +0000 UTC m=+1200.489017318" watchObservedRunningTime="2026-01-30 18:50:24.317058737 +0000 UTC m=+1200.585436762" Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.340499 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" podStartSLOduration=2.340479017 podStartE2EDuration="2.340479017s" podCreationTimestamp="2026-01-30 18:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:24.247078477 +0000 UTC m=+1200.515456592" watchObservedRunningTime="2026-01-30 18:50:24.340479017 +0000 UTC m=+1200.608857042" Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.628910 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.682657 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mfqf\" (UniqueName: \"kubernetes.io/projected/a89d76b9-7010-4d8b-ac8e-fac56394928d-kube-api-access-2mfqf\") pod \"a89d76b9-7010-4d8b-ac8e-fac56394928d\" (UID: \"a89d76b9-7010-4d8b-ac8e-fac56394928d\") " Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.687751 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89d76b9-7010-4d8b-ac8e-fac56394928d-kube-api-access-2mfqf" (OuterVolumeSpecName: "kube-api-access-2mfqf") pod "a89d76b9-7010-4d8b-ac8e-fac56394928d" (UID: "a89d76b9-7010-4d8b-ac8e-fac56394928d"). InnerVolumeSpecName "kube-api-access-2mfqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:24 crc kubenswrapper[4782]: I0130 18:50:24.784646 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mfqf\" (UniqueName: \"kubernetes.io/projected/a89d76b9-7010-4d8b-ac8e-fac56394928d-kube-api-access-2mfqf\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.197148 4782 generic.go:334] "Generic (PLEG): container finished" podID="d0185b2b-ed05-407b-93ee-3f5e83ee630a" containerID="29010e56782893fa617a8e2c0cd0c5a188f249db8d97bd95cd27e576509b2873" exitCode=0 Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.197378 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" event={"ID":"d0185b2b-ed05-407b-93ee-3f5e83ee630a","Type":"ContainerDied","Data":"29010e56782893fa617a8e2c0cd0c5a188f249db8d97bd95cd27e576509b2873"} Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.201109 4782 generic.go:334] "Generic (PLEG): container finished" podID="24d517ad-1f63-4dc7-9893-b686884dc3d8" containerID="80d8d94362d353c3ee5e16fb0223a49de80f56dcdac5824fda2ec39c90cb2bd3" exitCode=0 Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.201318 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4k56s" event={"ID":"24d517ad-1f63-4dc7-9893-b686884dc3d8","Type":"ContainerDied","Data":"80d8d94362d353c3ee5e16fb0223a49de80f56dcdac5824fda2ec39c90cb2bd3"} Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.202898 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerStarted","Data":"cc219bed63745192a932a2bfe6e899049777cfffcdbbaf355bc5dec5b3eb6bd3"} Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.202958 4782 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerStarted","Data":"66a3f3e53dc4f53f6e95cad14167d3a806a101b20e33999590a85fe168a4aac6"} Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.210537 4782 generic.go:334] "Generic (PLEG): container finished" podID="6bdb65b3-6289-42fb-8b67-290b3b72cb4f" containerID="b6d3219b7f40a2655f11e2a18a6aedf81bf3a961c4d7ecf9036790423358269e" exitCode=0 Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.210639 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" event={"ID":"6bdb65b3-6289-42fb-8b67-290b3b72cb4f","Type":"ContainerDied","Data":"b6d3219b7f40a2655f11e2a18a6aedf81bf3a961c4d7ecf9036790423358269e"} Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.214544 4782 generic.go:334] "Generic (PLEG): container finished" podID="a89d76b9-7010-4d8b-ac8e-fac56394928d" containerID="d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7" exitCode=2 Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.214580 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.214663 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a89d76b9-7010-4d8b-ac8e-fac56394928d","Type":"ContainerDied","Data":"d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7"} Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.214718 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a89d76b9-7010-4d8b-ac8e-fac56394928d","Type":"ContainerDied","Data":"2f39b6e653540ebf2b4f78792590051ee026fb66fac3ea5c6da5cddf04ef014b"} Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.214741 4782 scope.go:117] "RemoveContainer" containerID="d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.281360 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.288431 4782 scope.go:117] "RemoveContainer" containerID="d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7" Jan 30 18:50:25 crc kubenswrapper[4782]: E0130 18:50:25.295305 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7\": container with ID starting with d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7 not found: ID does not exist" containerID="d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.295346 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7"} err="failed to get container status \"d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7\": rpc error: code = NotFound desc = could not find container \"d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7\": container with ID starting with d584f411cb82cff5bf2729e277f605c9a64d2bd0a6ecd42552c6720961fccac7 not found: ID does not exist" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.300839 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/kube-state-metrics-0"] Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.311306 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:50:25 crc kubenswrapper[4782]: E0130 18:50:25.311778 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89d76b9-7010-4d8b-ac8e-fac56394928d" containerName="kube-state-metrics" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.311796 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89d76b9-7010-4d8b-ac8e-fac56394928d" containerName="kube-state-metrics" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.312971 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89d76b9-7010-4d8b-ac8e-fac56394928d" containerName="kube-state-metrics" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.313612 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.323006 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.325289 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.358484 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.394662 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzg9f\" (UniqueName: \"kubernetes.io/projected/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-api-access-mzg9f\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.394771 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.395931 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.395990 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: E0130 18:50:25.442675 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda89d76b9_7010_4d8b_ac8e_fac56394928d.slice/crio-2f39b6e653540ebf2b4f78792590051ee026fb66fac3ea5c6da5cddf04ef014b\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda89d76b9_7010_4d8b_ac8e_fac56394928d.slice\": RecentStats: unable to find data in memory cache]" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.499491 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.500571 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.500615 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.501107 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzg9f\" (UniqueName: \"kubernetes.io/projected/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-api-access-mzg9f\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.505864 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.508064 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.519187 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.556859 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzg9f\" (UniqueName: \"kubernetes.io/projected/2faa1c8b-e69c-4b72-bc58-0d1a5e032d52-kube-api-access-mzg9f\") pod \"kube-state-metrics-0\" (UID: \"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52\") " pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.658682 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.814817 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.905663 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.919303 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.920737 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp8kh\" (UniqueName: \"kubernetes.io/projected/04bb697c-568d-47bc-abb6-56dc09be923d-kube-api-access-vp8kh\") pod \"04bb697c-568d-47bc-abb6-56dc09be923d\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.920796 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04bb697c-568d-47bc-abb6-56dc09be923d-operator-scripts\") pod \"04bb697c-568d-47bc-abb6-56dc09be923d\" (UID: \"04bb697c-568d-47bc-abb6-56dc09be923d\") " Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.922412 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04bb697c-568d-47bc-abb6-56dc09be923d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04bb697c-568d-47bc-abb6-56dc09be923d" (UID: "04bb697c-568d-47bc-abb6-56dc09be923d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:25 crc kubenswrapper[4782]: I0130 18:50:25.927323 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04bb697c-568d-47bc-abb6-56dc09be923d-kube-api-access-vp8kh" (OuterVolumeSpecName: "kube-api-access-vp8kh") pod "04bb697c-568d-47bc-abb6-56dc09be923d" (UID: "04bb697c-568d-47bc-abb6-56dc09be923d"). InnerVolumeSpecName "kube-api-access-vp8kh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.022973 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2gf4\" (UniqueName: \"kubernetes.io/projected/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-kube-api-access-m2gf4\") pod \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.023048 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-operator-scripts\") pod \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.023076 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwb98\" (UniqueName: \"kubernetes.io/projected/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-kube-api-access-qwb98\") pod \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\" (UID: \"233df4b9-7dfd-4817-b0fc-51db7b8d77d1\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.023363 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-operator-scripts\") pod \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\" (UID: \"0d5e6d06-a8c5-4789-8e21-6aba18cb8088\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.023847 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d5e6d06-a8c5-4789-8e21-6aba18cb8088" (UID: "0d5e6d06-a8c5-4789-8e21-6aba18cb8088"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.024221 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.024249 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp8kh\" (UniqueName: \"kubernetes.io/projected/04bb697c-568d-47bc-abb6-56dc09be923d-kube-api-access-vp8kh\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.024259 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04bb697c-568d-47bc-abb6-56dc09be923d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.024612 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "233df4b9-7dfd-4817-b0fc-51db7b8d77d1" (UID: "233df4b9-7dfd-4817-b0fc-51db7b8d77d1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.026567 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-kube-api-access-m2gf4" (OuterVolumeSpecName: "kube-api-access-m2gf4") pod "0d5e6d06-a8c5-4789-8e21-6aba18cb8088" (UID: "0d5e6d06-a8c5-4789-8e21-6aba18cb8088"). InnerVolumeSpecName "kube-api-access-m2gf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.028543 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-kube-api-access-qwb98" (OuterVolumeSpecName: "kube-api-access-qwb98") pod "233df4b9-7dfd-4817-b0fc-51db7b8d77d1" (UID: "233df4b9-7dfd-4817-b0fc-51db7b8d77d1"). InnerVolumeSpecName "kube-api-access-qwb98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.125949 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2gf4\" (UniqueName: \"kubernetes.io/projected/0d5e6d06-a8c5-4789-8e21-6aba18cb8088-kube-api-access-m2gf4\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.125977 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.125987 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwb98\" (UniqueName: \"kubernetes.io/projected/233df4b9-7dfd-4817-b0fc-51db7b8d77d1-kube-api-access-qwb98\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.200749 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.229615 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-csd5t" event={"ID":"04bb697c-568d-47bc-abb6-56dc09be923d","Type":"ContainerDied","Data":"34f553faf3588a0fcca4d7e8797cce4857be78fba14dacf80c89e46aff37eed9"} Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.229671 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-csd5t" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.229682 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34f553faf3588a0fcca4d7e8797cce4857be78fba14dacf80c89e46aff37eed9" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.239566 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f36-account-create-update-5x2wl" event={"ID":"233df4b9-7dfd-4817-b0fc-51db7b8d77d1","Type":"ContainerDied","Data":"bdf29516469368bf8dc785772f563d3fb82cc49a730858ee7ef3f9fbb5f8df19"} Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.239604 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf29516469368bf8dc785772f563d3fb82cc49a730858ee7ef3f9fbb5f8df19" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.239670 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2f36-account-create-update-5x2wl" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.242212 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mths6" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.242245 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mths6" event={"ID":"0d5e6d06-a8c5-4789-8e21-6aba18cb8088","Type":"ContainerDied","Data":"acbe585ab7603878fdf93ac53e287934a3fb8c7f2af7f5166e15c3f3ff498db9"} Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.242438 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acbe585ab7603878fdf93ac53e287934a3fb8c7f2af7f5166e15c3f3ff498db9" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.303004 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.325015 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-695d477669-wlmct" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.427612 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89d76b9-7010-4d8b-ac8e-fac56394928d" path="/var/lib/kubelet/pods/a89d76b9-7010-4d8b-ac8e-fac56394928d/volumes" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.428130 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64684dfb44-vvmcx"] Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.428349 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64684dfb44-vvmcx" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-api" containerID="cri-o://a873fad9675ee26d9360b345482b465a6a465f5d77368b767258ac24b5b72bf2" gracePeriod=30 Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.428454 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64684dfb44-vvmcx" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-httpd" containerID="cri-o://9454f90cec4c047dd7f4d13b8551aa39d63486a6d2cdfa864f0b839315136218" gracePeriod=30 Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.765072 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.769935 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.800539 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.841123 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d517ad-1f63-4dc7-9893-b686884dc3d8-operator-scripts\") pod \"24d517ad-1f63-4dc7-9893-b686884dc3d8\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.841365 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-operator-scripts\") pod \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.841462 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0185b2b-ed05-407b-93ee-3f5e83ee630a-operator-scripts\") pod \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.841490 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km49j\" (UniqueName: \"kubernetes.io/projected/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-kube-api-access-km49j\") pod \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\" (UID: \"6bdb65b3-6289-42fb-8b67-290b3b72cb4f\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.841506 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mwrv\" (UniqueName: \"kubernetes.io/projected/d0185b2b-ed05-407b-93ee-3f5e83ee630a-kube-api-access-8mwrv\") pod \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\" (UID: \"d0185b2b-ed05-407b-93ee-3f5e83ee630a\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.841526 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rknrj\" (UniqueName: \"kubernetes.io/projected/24d517ad-1f63-4dc7-9893-b686884dc3d8-kube-api-access-rknrj\") pod \"24d517ad-1f63-4dc7-9893-b686884dc3d8\" (UID: \"24d517ad-1f63-4dc7-9893-b686884dc3d8\") " Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.842617 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0185b2b-ed05-407b-93ee-3f5e83ee630a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0185b2b-ed05-407b-93ee-3f5e83ee630a" (UID: "d0185b2b-ed05-407b-93ee-3f5e83ee630a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.843577 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d517ad-1f63-4dc7-9893-b686884dc3d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24d517ad-1f63-4dc7-9893-b686884dc3d8" (UID: "24d517ad-1f63-4dc7-9893-b686884dc3d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.844099 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bdb65b3-6289-42fb-8b67-290b3b72cb4f" (UID: "6bdb65b3-6289-42fb-8b67-290b3b72cb4f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.856938 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d517ad-1f63-4dc7-9893-b686884dc3d8-kube-api-access-rknrj" (OuterVolumeSpecName: "kube-api-access-rknrj") pod "24d517ad-1f63-4dc7-9893-b686884dc3d8" (UID: "24d517ad-1f63-4dc7-9893-b686884dc3d8"). InnerVolumeSpecName "kube-api-access-rknrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.859683 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-kube-api-access-km49j" (OuterVolumeSpecName: "kube-api-access-km49j") pod "6bdb65b3-6289-42fb-8b67-290b3b72cb4f" (UID: "6bdb65b3-6289-42fb-8b67-290b3b72cb4f"). InnerVolumeSpecName "kube-api-access-km49j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.860756 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0185b2b-ed05-407b-93ee-3f5e83ee630a-kube-api-access-8mwrv" (OuterVolumeSpecName: "kube-api-access-8mwrv") pod "d0185b2b-ed05-407b-93ee-3f5e83ee630a" (UID: "d0185b2b-ed05-407b-93ee-3f5e83ee630a"). InnerVolumeSpecName "kube-api-access-8mwrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.943146 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0185b2b-ed05-407b-93ee-3f5e83ee630a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.943176 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km49j\" (UniqueName: \"kubernetes.io/projected/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-kube-api-access-km49j\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.943186 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mwrv\" (UniqueName: \"kubernetes.io/projected/d0185b2b-ed05-407b-93ee-3f5e83ee630a-kube-api-access-8mwrv\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.943195 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rknrj\" (UniqueName: \"kubernetes.io/projected/24d517ad-1f63-4dc7-9893-b686884dc3d8-kube-api-access-rknrj\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.943204 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d517ad-1f63-4dc7-9893-b686884dc3d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:26 crc kubenswrapper[4782]: I0130 18:50:26.943213 4782 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bdb65b3-6289-42fb-8b67-290b3b72cb4f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.267811 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" event={"ID":"6bdb65b3-6289-42fb-8b67-290b3b72cb4f","Type":"ContainerDied","Data":"306358912bcb83a5a7a8a187a5a790a82c58d1c793500419909e356149dd946c"} Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.268189 4782 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="306358912bcb83a5a7a8a187a5a790a82c58d1c793500419909e356149dd946c" Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.267831 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4ad3-account-create-update-t6n2b" Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.275629 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52","Type":"ContainerStarted","Data":"c8bc4526d86ba685c712e1db849e2a630ee53fde2d28e4604953dd2480bacd37"} Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.283010 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" event={"ID":"d0185b2b-ed05-407b-93ee-3f5e83ee630a","Type":"ContainerDied","Data":"a6419dc718a78f652060122a4ad24f8da0ceb3bde5491efcca68fd284b31fd2f"} Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.283057 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6419dc718a78f652060122a4ad24f8da0ceb3bde5491efcca68fd284b31fd2f" Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.283166 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4a3d-account-create-update-h2xvb" Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.287712 4782 generic.go:334] "Generic (PLEG): container finished" podID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerID="9454f90cec4c047dd7f4d13b8551aa39d63486a6d2cdfa864f0b839315136218" exitCode=0 Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.287788 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64684dfb44-vvmcx" event={"ID":"076239f6-df06-4f9a-a0e6-70413767a0c9","Type":"ContainerDied","Data":"9454f90cec4c047dd7f4d13b8551aa39d63486a6d2cdfa864f0b839315136218"} Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.290737 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4k56s" event={"ID":"24d517ad-1f63-4dc7-9893-b686884dc3d8","Type":"ContainerDied","Data":"30c74568813f04d1a5e7a4d7ee9a8b2d68bd04aac9bd61502398b7a4c2dd6fce"} Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.290866 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30c74568813f04d1a5e7a4d7ee9a8b2d68bd04aac9bd61502398b7a4c2dd6fce" Jan 30 18:50:27 crc kubenswrapper[4782]: I0130 18:50:27.290804 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4k56s" Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.305620 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerStarted","Data":"7e1045b68795a7de17d5b661893d08a5a1e1a4f1f651f8c5064071ad408f4668"} Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.306135 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.305772 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="sg-core" containerID="cri-o://cc219bed63745192a932a2bfe6e899049777cfffcdbbaf355bc5dec5b3eb6bd3" gracePeriod=30 Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.305718 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-central-agent" containerID="cri-o://8f47df6de6800f44213037898e2ff18f3115b7fd8eb63241bba0f07ffec58eca" gracePeriod=30 Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.305822 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="proxy-httpd" containerID="cri-o://7e1045b68795a7de17d5b661893d08a5a1e1a4f1f651f8c5064071ad408f4668" gracePeriod=30 Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.305837 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-notification-agent" containerID="cri-o://66a3f3e53dc4f53f6e95cad14167d3a806a101b20e33999590a85fe168a4aac6" gracePeriod=30 Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.312710 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2faa1c8b-e69c-4b72-bc58-0d1a5e032d52","Type":"ContainerStarted","Data":"556e90b9ed9366b30f409260a10ce91e7277a060513da3c6b1807128d8c584e6"} Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.312926 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 18:50:28 crc kubenswrapper[4782]: I0130 18:50:28.337077 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.480728785 podStartE2EDuration="6.337057073s" podCreationTimestamp="2026-01-30 18:50:22 +0000 UTC" firstStartedPulling="2026-01-30 18:50:23.657717923 +0000 UTC m=+1199.926095948" lastFinishedPulling="2026-01-30 18:50:27.514046211 +0000 UTC m=+1203.782424236" observedRunningTime="2026-01-30 18:50:28.329803714 +0000 UTC m=+1204.598181769" watchObservedRunningTime="2026-01-30 18:50:28.337057073 +0000 UTC m=+1204.605435108" Jan 30 18:50:29 crc kubenswrapper[4782]: I0130 18:50:29.336028 4782 generic.go:334] "Generic (PLEG): container finished" podID="d2418592-40ea-4fe5-91f1-134a6635f724" containerID="7e1045b68795a7de17d5b661893d08a5a1e1a4f1f651f8c5064071ad408f4668" exitCode=0 Jan 30 18:50:29 crc kubenswrapper[4782]: I0130 18:50:29.336060 4782 generic.go:334] "Generic (PLEG): container finished" podID="d2418592-40ea-4fe5-91f1-134a6635f724" containerID="cc219bed63745192a932a2bfe6e899049777cfffcdbbaf355bc5dec5b3eb6bd3" exitCode=2 Jan 30 18:50:29 crc 
kubenswrapper[4782]: I0130 18:50:29.336070 4782 generic.go:334] "Generic (PLEG): container finished" podID="d2418592-40ea-4fe5-91f1-134a6635f724" containerID="66a3f3e53dc4f53f6e95cad14167d3a806a101b20e33999590a85fe168a4aac6" exitCode=0 Jan 30 18:50:29 crc kubenswrapper[4782]: I0130 18:50:29.336128 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerDied","Data":"7e1045b68795a7de17d5b661893d08a5a1e1a4f1f651f8c5064071ad408f4668"} Jan 30 18:50:29 crc kubenswrapper[4782]: I0130 18:50:29.336194 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerDied","Data":"cc219bed63745192a932a2bfe6e899049777cfffcdbbaf355bc5dec5b3eb6bd3"} Jan 30 18:50:29 crc kubenswrapper[4782]: I0130 18:50:29.336216 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerDied","Data":"66a3f3e53dc4f53f6e95cad14167d3a806a101b20e33999590a85fe168a4aac6"} Jan 30 18:50:31 crc kubenswrapper[4782]: I0130 18:50:31.177381 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6989f95847-z8k6r" Jan 30 18:50:31 crc kubenswrapper[4782]: I0130 18:50:31.218701 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=5.542245961 podStartE2EDuration="6.218671408s" podCreationTimestamp="2026-01-30 18:50:25 +0000 UTC" firstStartedPulling="2026-01-30 18:50:26.324542698 +0000 UTC m=+1202.592920723" lastFinishedPulling="2026-01-30 18:50:27.000968135 +0000 UTC m=+1203.269346170" observedRunningTime="2026-01-30 18:50:28.349984283 +0000 UTC m=+1204.618362308" watchObservedRunningTime="2026-01-30 18:50:31.218671408 +0000 UTC m=+1207.487049473" Jan 30 18:50:31 crc kubenswrapper[4782]: I0130 18:50:31.361149 4782 generic.go:334] "Generic (PLEG): container finished" podID="d2418592-40ea-4fe5-91f1-134a6635f724" containerID="8f47df6de6800f44213037898e2ff18f3115b7fd8eb63241bba0f07ffec58eca" exitCode=0 Jan 30 18:50:31 crc kubenswrapper[4782]: I0130 18:50:31.361275 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerDied","Data":"8f47df6de6800f44213037898e2ff18f3115b7fd8eb63241bba0f07ffec58eca"} Jan 30 18:50:31 crc kubenswrapper[4782]: I0130 18:50:31.364155 4782 generic.go:334] "Generic (PLEG): container finished" podID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerID="a873fad9675ee26d9360b345482b465a6a465f5d77368b767258ac24b5b72bf2" exitCode=0 Jan 30 18:50:31 crc kubenswrapper[4782]: I0130 18:50:31.364190 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64684dfb44-vvmcx" event={"ID":"076239f6-df06-4f9a-a0e6-70413767a0c9","Type":"ContainerDied","Data":"a873fad9675ee26d9360b345482b465a6a465f5d77368b767258ac24b5b72bf2"} Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.279476 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.353891 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-sg-core-conf-yaml\") pod \"d2418592-40ea-4fe5-91f1-134a6635f724\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.354012 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-scripts\") pod \"d2418592-40ea-4fe5-91f1-134a6635f724\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.354065 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-combined-ca-bundle\") pod \"d2418592-40ea-4fe5-91f1-134a6635f724\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.354123 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-run-httpd\") pod \"d2418592-40ea-4fe5-91f1-134a6635f724\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.354136 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-log-httpd\") pod \"d2418592-40ea-4fe5-91f1-134a6635f724\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.354205 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-config-data\") pod \"d2418592-40ea-4fe5-91f1-134a6635f724\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.354596 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cktpx\" (UniqueName: \"kubernetes.io/projected/d2418592-40ea-4fe5-91f1-134a6635f724-kube-api-access-cktpx\") pod \"d2418592-40ea-4fe5-91f1-134a6635f724\" (UID: \"d2418592-40ea-4fe5-91f1-134a6635f724\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.356467 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d2418592-40ea-4fe5-91f1-134a6635f724" (UID: "d2418592-40ea-4fe5-91f1-134a6635f724"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.357015 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d2418592-40ea-4fe5-91f1-134a6635f724" (UID: "d2418592-40ea-4fe5-91f1-134a6635f724"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.371930 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2418592-40ea-4fe5-91f1-134a6635f724-kube-api-access-cktpx" (OuterVolumeSpecName: "kube-api-access-cktpx") pod "d2418592-40ea-4fe5-91f1-134a6635f724" (UID: "d2418592-40ea-4fe5-91f1-134a6635f724"). InnerVolumeSpecName "kube-api-access-cktpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.382380 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-scripts" (OuterVolumeSpecName: "scripts") pod "d2418592-40ea-4fe5-91f1-134a6635f724" (UID: "d2418592-40ea-4fe5-91f1-134a6635f724"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.386635 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64684dfb44-vvmcx" event={"ID":"076239f6-df06-4f9a-a0e6-70413767a0c9","Type":"ContainerDied","Data":"931ab4d9e4325700563ffaf1747d0f28141b450cc79b960e97c218c7471c4634"} Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.386675 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="931ab4d9e4325700563ffaf1747d0f28141b450cc79b960e97c218c7471c4634" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.392820 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2418592-40ea-4fe5-91f1-134a6635f724","Type":"ContainerDied","Data":"7d91dde19e5f39479215600e0ee55978d306c09ab5d4d0cc63cf457f20b4d606"} Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.392853 4782 scope.go:117] "RemoveContainer" containerID="7e1045b68795a7de17d5b661893d08a5a1e1a4f1f651f8c5064071ad408f4668" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.393002 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.410951 4782 scope.go:117] "RemoveContainer" containerID="8c84961fe98ddcb4c3fa386a648bab582bdbd79b278cd6a12b723b29bedef783" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.457197 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.457253 4782 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.457269 4782 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2418592-40ea-4fe5-91f1-134a6635f724-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.457282 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cktpx\" (UniqueName: \"kubernetes.io/projected/d2418592-40ea-4fe5-91f1-134a6635f724-kube-api-access-cktpx\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.472103 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.472377 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d2418592-40ea-4fe5-91f1-134a6635f724" (UID: "d2418592-40ea-4fe5-91f1-134a6635f724"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.482428 4782 scope.go:117] "RemoveContainer" containerID="cc219bed63745192a932a2bfe6e899049777cfffcdbbaf355bc5dec5b3eb6bd3" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.501440 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2418592-40ea-4fe5-91f1-134a6635f724" (UID: "d2418592-40ea-4fe5-91f1-134a6635f724"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.503636 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsq8m"] Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504094 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-api" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504111 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-api" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504123 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bdb65b3-6289-42fb-8b67-290b3b72cb4f" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504133 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bdb65b3-6289-42fb-8b67-290b3b72cb4f" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504151 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233df4b9-7dfd-4817-b0fc-51db7b8d77d1" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504159 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="233df4b9-7dfd-4817-b0fc-51db7b8d77d1" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504167 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-httpd" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504174 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-httpd" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504191 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="proxy-httpd" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504197 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="proxy-httpd" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504206 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="sg-core" Jan 30 18:50:32 crc 
kubenswrapper[4782]: I0130 18:50:32.504211 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="sg-core" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504238 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5e6d06-a8c5-4789-8e21-6aba18cb8088" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504245 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5e6d06-a8c5-4789-8e21-6aba18cb8088" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504260 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0185b2b-ed05-407b-93ee-3f5e83ee630a" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504267 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0185b2b-ed05-407b-93ee-3f5e83ee630a" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504277 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-central-agent" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504283 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-central-agent" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504298 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-notification-agent" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504304 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-notification-agent" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504316 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04bb697c-568d-47bc-abb6-56dc09be923d" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504322 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bb697c-568d-47bc-abb6-56dc09be923d" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: E0130 18:50:32.504329 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d517ad-1f63-4dc7-9893-b686884dc3d8" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504335 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d517ad-1f63-4dc7-9893-b686884dc3d8" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504504 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0185b2b-ed05-407b-93ee-3f5e83ee630a" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504517 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-api" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504524 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="proxy-httpd" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504536 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="04bb697c-568d-47bc-abb6-56dc09be923d" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 
18:50:32.504545 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bdb65b3-6289-42fb-8b67-290b3b72cb4f" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504558 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" containerName="neutron-httpd" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504577 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d517ad-1f63-4dc7-9893-b686884dc3d8" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504585 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5e6d06-a8c5-4789-8e21-6aba18cb8088" containerName="mariadb-database-create" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504594 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-notification-agent" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504602 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="sg-core" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504629 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="233df4b9-7dfd-4817-b0fc-51db7b8d77d1" containerName="mariadb-account-create-update" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.504638 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" containerName="ceilometer-central-agent" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.505301 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.512102 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.512255 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.512405 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jdpv6" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.522570 4782 scope.go:117] "RemoveContainer" containerID="66a3f3e53dc4f53f6e95cad14167d3a806a101b20e33999590a85fe168a4aac6" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.532285 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsq8m"] Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.558183 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-config\") pod \"076239f6-df06-4f9a-a0e6-70413767a0c9\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.558540 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-combined-ca-bundle\") pod \"076239f6-df06-4f9a-a0e6-70413767a0c9\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.558665 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jvzzh\" (UniqueName: \"kubernetes.io/projected/076239f6-df06-4f9a-a0e6-70413767a0c9-kube-api-access-jvzzh\") pod \"076239f6-df06-4f9a-a0e6-70413767a0c9\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.558826 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-httpd-config\") pod \"076239f6-df06-4f9a-a0e6-70413767a0c9\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.558922 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-ovndb-tls-certs\") pod \"076239f6-df06-4f9a-a0e6-70413767a0c9\" (UID: \"076239f6-df06-4f9a-a0e6-70413767a0c9\") " Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.559558 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-config-data\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.559641 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-scripts\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.559951 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq8ww\" (UniqueName: \"kubernetes.io/projected/616ca793-5768-474b-b80c-c29026c68bd6-kube-api-access-sq8ww\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.560139 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.560350 4782 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.560369 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.561535 4782 scope.go:117] "RemoveContainer" containerID="8f47df6de6800f44213037898e2ff18f3115b7fd8eb63241bba0f07ffec58eca" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.561724 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076239f6-df06-4f9a-a0e6-70413767a0c9-kube-api-access-jvzzh" 
(OuterVolumeSpecName: "kube-api-access-jvzzh") pod "076239f6-df06-4f9a-a0e6-70413767a0c9" (UID: "076239f6-df06-4f9a-a0e6-70413767a0c9"). InnerVolumeSpecName "kube-api-access-jvzzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.562930 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "076239f6-df06-4f9a-a0e6-70413767a0c9" (UID: "076239f6-df06-4f9a-a0e6-70413767a0c9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.587778 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-config-data" (OuterVolumeSpecName: "config-data") pod "d2418592-40ea-4fe5-91f1-134a6635f724" (UID: "d2418592-40ea-4fe5-91f1-134a6635f724"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.650326 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-config" (OuterVolumeSpecName: "config") pod "076239f6-df06-4f9a-a0e6-70413767a0c9" (UID: "076239f6-df06-4f9a-a0e6-70413767a0c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.653825 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "076239f6-df06-4f9a-a0e6-70413767a0c9" (UID: "076239f6-df06-4f9a-a0e6-70413767a0c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661669 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661740 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-config-data\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661781 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-scripts\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661855 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq8ww\" (UniqueName: \"kubernetes.io/projected/616ca793-5768-474b-b80c-c29026c68bd6-kube-api-access-sq8ww\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661934 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661945 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661957 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvzzh\" (UniqueName: \"kubernetes.io/projected/076239f6-df06-4f9a-a0e6-70413767a0c9-kube-api-access-jvzzh\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661969 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2418592-40ea-4fe5-91f1-134a6635f724-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.661978 4782 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.665636 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-scripts\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.666009 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-config-data\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.667113 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.680531 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq8ww\" (UniqueName: \"kubernetes.io/projected/616ca793-5768-474b-b80c-c29026c68bd6-kube-api-access-sq8ww\") pod \"nova-cell0-conductor-db-sync-wsq8m\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.681018 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "076239f6-df06-4f9a-a0e6-70413767a0c9" (UID: "076239f6-df06-4f9a-a0e6-70413767a0c9"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.727901 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.737016 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.749694 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.751847 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.754567 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.754813 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.754912 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.763431 4782 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076239f6-df06-4f9a-a0e6-70413767a0c9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.764753 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.847678 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870267 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870338 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr97x\" (UniqueName: \"kubernetes.io/projected/6d8b737e-7a4f-4831-810b-2f8eadc6712d-kube-api-access-hr97x\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870364 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-scripts\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870421 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870462 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-config-data\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870480 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870506 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.870536 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972104 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-config-data\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972380 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972407 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972438 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972492 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972529 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr97x\" (UniqueName: \"kubernetes.io/projected/6d8b737e-7a4f-4831-810b-2f8eadc6712d-kube-api-access-hr97x\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972554 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-scripts\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.972607 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.973120 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.975890 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.979935 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-config-data\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.980331 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.980339 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-scripts\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.985776 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.992625 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr97x\" (UniqueName: \"kubernetes.io/projected/6d8b737e-7a4f-4831-810b-2f8eadc6712d-kube-api-access-hr97x\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:32 crc kubenswrapper[4782]: I0130 18:50:32.996100 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " pod="openstack/ceilometer-0" Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.066381 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.306578 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsq8m"] Jan 30 18:50:33 crc kubenswrapper[4782]: W0130 18:50:33.318542 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod616ca793_5768_474b_b80c_c29026c68bd6.slice/crio-a40ab029cac7d2a1ae12d1ba3affdde28e68424078e63963d88cc9e35efd0747 WatchSource:0}: Error finding container a40ab029cac7d2a1ae12d1ba3affdde28e68424078e63963d88cc9e35efd0747: Status 404 returned error can't find the container with id a40ab029cac7d2a1ae12d1ba3affdde28e68424078e63963d88cc9e35efd0747 Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.411694 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" event={"ID":"616ca793-5768-474b-b80c-c29026c68bd6","Type":"ContainerStarted","Data":"a40ab029cac7d2a1ae12d1ba3affdde28e68424078e63963d88cc9e35efd0747"} Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.418827 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64684dfb44-vvmcx" Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.420167 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerStarted","Data":"141cff3164c6317ad372fa00a1c8ca2c05a1b2d846b176bc76304634576f1afc"} Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.465261 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64684dfb44-vvmcx"] Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.476843 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-64684dfb44-vvmcx"] Jan 30 18:50:33 crc kubenswrapper[4782]: I0130 18:50:33.526297 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:34 crc kubenswrapper[4782]: I0130 18:50:34.424756 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076239f6-df06-4f9a-a0e6-70413767a0c9" path="/var/lib/kubelet/pods/076239f6-df06-4f9a-a0e6-70413767a0c9/volumes" Jan 30 18:50:34 crc kubenswrapper[4782]: I0130 18:50:34.426637 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2418592-40ea-4fe5-91f1-134a6635f724" path="/var/lib/kubelet/pods/d2418592-40ea-4fe5-91f1-134a6635f724/volumes" Jan 30 18:50:34 crc kubenswrapper[4782]: I0130 18:50:34.443797 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerStarted","Data":"935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261"} Jan 30 18:50:34 crc kubenswrapper[4782]: I0130 18:50:34.444035 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerStarted","Data":"c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52"} Jan 30 18:50:34 crc kubenswrapper[4782]: I0130 18:50:34.444104 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerStarted","Data":"600343d072950175942fc27ef02b8084f5c3ab894855cd5c4397a5bb124c6fc3"} Jan 30 18:50:35 crc kubenswrapper[4782]: I0130 18:50:35.456339 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerStarted","Data":"fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572"} Jan 30 18:50:35 crc kubenswrapper[4782]: I0130 18:50:35.669114 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 18:50:36 crc kubenswrapper[4782]: I0130 18:50:36.034680 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:37 crc kubenswrapper[4782]: I0130 18:50:37.887748 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:37 crc kubenswrapper[4782]: I0130 18:50:37.888040 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:37 crc kubenswrapper[4782]: I0130 18:50:37.915081 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:38 crc kubenswrapper[4782]: I0130 18:50:38.550112 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/watcher-decision-engine-0" Jan 30 18:50:38 crc kubenswrapper[4782]: I0130 18:50:38.588918 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:50:40 crc kubenswrapper[4782]: I0130 18:50:40.510687 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" containerID="cri-o://141cff3164c6317ad372fa00a1c8ca2c05a1b2d846b176bc76304634576f1afc" gracePeriod=30 Jan 30 18:50:41 crc kubenswrapper[4782]: I0130 18:50:41.519775 4782 generic.go:334] "Generic (PLEG): container finished" podID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerID="141cff3164c6317ad372fa00a1c8ca2c05a1b2d846b176bc76304634576f1afc" exitCode=0 Jan 30 18:50:41 crc kubenswrapper[4782]: I0130 18:50:41.520035 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerDied","Data":"141cff3164c6317ad372fa00a1c8ca2c05a1b2d846b176bc76304634576f1afc"} Jan 30 18:50:41 crc kubenswrapper[4782]: I0130 18:50:41.520069 4782 scope.go:117] "RemoveContainer" containerID="8c84961fe98ddcb4c3fa386a648bab582bdbd79b278cd6a12b723b29bedef783" Jan 30 18:50:41 crc kubenswrapper[4782]: I0130 18:50:41.831077 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:41 crc kubenswrapper[4782]: I0130 18:50:41.831416 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-log" containerID="cri-o://7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f" gracePeriod=30 Jan 30 18:50:41 crc kubenswrapper[4782]: I0130 18:50:41.831589 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-httpd" containerID="cri-o://ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc" gracePeriod=30 Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.555008 4782 generic.go:334] "Generic (PLEG): container finished" podID="f11ce800-5d86-4411-922e-af28bc822732" containerID="7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f" exitCode=143 Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.555318 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f11ce800-5d86-4411-922e-af28bc822732","Type":"ContainerDied","Data":"7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f"} Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.567462 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.673994 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sqh6\" (UniqueName: \"kubernetes.io/projected/e82abe4a-d9ad-47dd-bd5c-2704052ba388-kube-api-access-4sqh6\") pod \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.674149 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-custom-prometheus-ca\") pod \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.674275 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82abe4a-d9ad-47dd-bd5c-2704052ba388-logs\") pod \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.674413 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-combined-ca-bundle\") pod \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.674467 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-config-data\") pod \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\" (UID: \"e82abe4a-d9ad-47dd-bd5c-2704052ba388\") " Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.675811 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e82abe4a-d9ad-47dd-bd5c-2704052ba388-logs" (OuterVolumeSpecName: "logs") pod "e82abe4a-d9ad-47dd-bd5c-2704052ba388" (UID: "e82abe4a-d9ad-47dd-bd5c-2704052ba388"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.679840 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82abe4a-d9ad-47dd-bd5c-2704052ba388-kube-api-access-4sqh6" (OuterVolumeSpecName: "kube-api-access-4sqh6") pod "e82abe4a-d9ad-47dd-bd5c-2704052ba388" (UID: "e82abe4a-d9ad-47dd-bd5c-2704052ba388"). InnerVolumeSpecName "kube-api-access-4sqh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.704820 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e82abe4a-d9ad-47dd-bd5c-2704052ba388" (UID: "e82abe4a-d9ad-47dd-bd5c-2704052ba388"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.714758 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e82abe4a-d9ad-47dd-bd5c-2704052ba388" (UID: "e82abe4a-d9ad-47dd-bd5c-2704052ba388"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.736060 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-config-data" (OuterVolumeSpecName: "config-data") pod "e82abe4a-d9ad-47dd-bd5c-2704052ba388" (UID: "e82abe4a-d9ad-47dd-bd5c-2704052ba388"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.777154 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sqh6\" (UniqueName: \"kubernetes.io/projected/e82abe4a-d9ad-47dd-bd5c-2704052ba388-kube-api-access-4sqh6\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.777186 4782 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.777198 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e82abe4a-d9ad-47dd-bd5c-2704052ba388-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.777207 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:42 crc kubenswrapper[4782]: I0130 18:50:42.777216 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82abe4a-d9ad-47dd-bd5c-2704052ba388-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.137624 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184420 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-scripts\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184483 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-logs\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184502 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-httpd-run\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184652 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-public-tls-certs\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184684 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-config-data\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184727 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp2kq\" (UniqueName: \"kubernetes.io/projected/f11ce800-5d86-4411-922e-af28bc822732-kube-api-access-xp2kq\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184771 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.184834 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-combined-ca-bundle\") pod \"f11ce800-5d86-4411-922e-af28bc822732\" (UID: \"f11ce800-5d86-4411-922e-af28bc822732\") " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.189042 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-logs" (OuterVolumeSpecName: "logs") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.189296 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.191950 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-scripts" (OuterVolumeSpecName: "scripts") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.217746 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.228417 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11ce800-5d86-4411-922e-af28bc822732-kube-api-access-xp2kq" (OuterVolumeSpecName: "kube-api-access-xp2kq") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "kube-api-access-xp2kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.246410 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.247577 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerName="glance-log" containerID="cri-o://cc17318eb81e0fab13522608b3af76a8b5e1c1b435c697df5058572259139b1c" gracePeriod=30 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.247763 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerName="glance-httpd" containerID="cri-o://6657418c9a7b7eccd97f606c95485ebde2d2ee5d06489b30bf910d1a6569a0a9" gracePeriod=30 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.276431 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.276543 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-config-data" (OuterVolumeSpecName: "config-data") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.294958 4782 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.294986 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.294996 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp2kq\" (UniqueName: \"kubernetes.io/projected/f11ce800-5d86-4411-922e-af28bc822732-kube-api-access-xp2kq\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.295019 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.295028 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.295037 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.295045 4782 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f11ce800-5d86-4411-922e-af28bc822732-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.296550 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f11ce800-5d86-4411-922e-af28bc822732" (UID: "f11ce800-5d86-4411-922e-af28bc822732"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.318342 4782 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.396966 4782 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.397001 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11ce800-5d86-4411-922e-af28bc822732-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.564776 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.564767 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"e82abe4a-d9ad-47dd-bd5c-2704052ba388","Type":"ContainerDied","Data":"c808a9cd1f3fedf1a620ec21ee0158757cbbc20f2759739edd5b1eb25b581b48"} Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.564899 4782 scope.go:117] "RemoveContainer" containerID="141cff3164c6317ad372fa00a1c8ca2c05a1b2d846b176bc76304634576f1afc" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.567021 4782 generic.go:334] "Generic (PLEG): container finished" podID="f11ce800-5d86-4411-922e-af28bc822732" containerID="ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc" exitCode=0 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.567074 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f11ce800-5d86-4411-922e-af28bc822732","Type":"ContainerDied","Data":"ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc"} Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.567101 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f11ce800-5d86-4411-922e-af28bc822732","Type":"ContainerDied","Data":"faa0ae45ff10cff3b9064e290118521473924f5f36322f5355b224fd3af6afb1"} Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.567168 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.581267 4782 generic.go:334] "Generic (PLEG): container finished" podID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerID="cc17318eb81e0fab13522608b3af76a8b5e1c1b435c697df5058572259139b1c" exitCode=143 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.581332 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3bb19d14-c078-4fee-81ab-6246f1f56059","Type":"ContainerDied","Data":"cc17318eb81e0fab13522608b3af76a8b5e1c1b435c697df5058572259139b1c"} Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.586950 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" event={"ID":"616ca793-5768-474b-b80c-c29026c68bd6","Type":"ContainerStarted","Data":"7a45e10cc336b44091367d42fa0e6c048a33f1597edc1ca8f69759f18364baa5"} Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.590816 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerStarted","Data":"670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155"} Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.590980 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-central-agent" containerID="cri-o://c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52" gracePeriod=30 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.591122 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.591132 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" 
containerName="sg-core" containerID="cri-o://fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572" gracePeriod=30 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.591178 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="proxy-httpd" containerID="cri-o://670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155" gracePeriod=30 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.591439 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-notification-agent" containerID="cri-o://935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261" gracePeriod=30 Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.603375 4782 scope.go:117] "RemoveContainer" containerID="ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.613358 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.628294 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.642384 4782 scope.go:117] "RemoveContainer" containerID="7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.652495 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.652906 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.652922 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.652939 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.652945 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.652976 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-log" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.652983 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-log" Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.652995 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-httpd" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653000 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-httpd" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653271 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653289 4782 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-httpd" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653297 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653309 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653320 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11ce800-5d86-4411-922e-af28bc822732" containerName="glance-log" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653329 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.653419 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" podStartSLOduration=2.610932102 podStartE2EDuration="11.653406481s" podCreationTimestamp="2026-01-30 18:50:32 +0000 UTC" firstStartedPulling="2026-01-30 18:50:33.33092225 +0000 UTC m=+1209.599300275" lastFinishedPulling="2026-01-30 18:50:42.373396629 +0000 UTC m=+1218.641774654" observedRunningTime="2026-01-30 18:50:43.621885852 +0000 UTC m=+1219.890263877" watchObservedRunningTime="2026-01-30 18:50:43.653406481 +0000 UTC m=+1219.921784506" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.654019 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.661070 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.688934 4782 scope.go:117] "RemoveContainer" containerID="ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc" Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.693315 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc\": container with ID starting with ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc not found: ID does not exist" containerID="ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.693554 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc"} err="failed to get container status \"ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc\": rpc error: code = NotFound desc = could not find container \"ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc\": container with ID starting with ac4e20f898344083cb835dfe707a9f9a074f01175fb6fe9b286d5ae61a5aebcc not found: ID does not exist" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.693697 4782 scope.go:117] "RemoveContainer" containerID="7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f" Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.695207 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f\": container with ID starting with 7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f not found: ID does not exist" containerID="7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.695369 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f"} err="failed to get container status \"7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f\": rpc error: code = NotFound desc = could not find container \"7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f\": container with ID starting with 7b409a5524e46254ee617f415cac5897f84b620ade4d4a65512114543197ec9f not found: ID does not exist" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.702753 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4ctd\" (UniqueName: \"kubernetes.io/projected/25e52062-f76c-4ebf-9738-8e5a9990aba9-kube-api-access-t4ctd\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.703025 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.703212 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e52062-f76c-4ebf-9738-8e5a9990aba9-logs\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.703333 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.703440 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.727632 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8611655799999998 podStartE2EDuration="11.727609096s" podCreationTimestamp="2026-01-30 18:50:32 +0000 UTC" firstStartedPulling="2026-01-30 18:50:33.506988604 +0000 UTC m=+1209.775366629" lastFinishedPulling="2026-01-30 18:50:42.37343212 +0000 UTC m=+1218.641810145" observedRunningTime="2026-01-30 18:50:43.661359858 +0000 UTC m=+1219.929737883" watchObservedRunningTime="2026-01-30 18:50:43.727609096 +0000 UTC m=+1219.995987121" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 
18:50:43.730482 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.769299 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.786151 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.806475 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e52062-f76c-4ebf-9738-8e5a9990aba9-logs\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.806521 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.806542 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.806666 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4ctd\" (UniqueName: \"kubernetes.io/projected/25e52062-f76c-4ebf-9738-8e5a9990aba9-kube-api-access-t4ctd\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.806686 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.809610 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25e52062-f76c-4ebf-9738-8e5a9990aba9-logs\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.814889 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.816860 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.820591 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25e52062-f76c-4ebf-9738-8e5a9990aba9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.837872 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4ctd\" (UniqueName: \"kubernetes.io/projected/25e52062-f76c-4ebf-9738-8e5a9990aba9-kube-api-access-t4ctd\") pod \"watcher-decision-engine-0\" (UID: \"25e52062-f76c-4ebf-9738-8e5a9990aba9\") " pod="openstack/watcher-decision-engine-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.852598 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.853043 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.853056 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: E0130 18:50:43.853070 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.853076 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" containerName="watcher-decision-engine" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.854205 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.859682 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.860461 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.886518 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:43 crc kubenswrapper[4782]: I0130 18:50:43.985986 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013582 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013631 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013667 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013701 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013719 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013749 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013766 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-logs\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.013799 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svn4b\" (UniqueName: \"kubernetes.io/projected/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-kube-api-access-svn4b\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115480 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115738 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115777 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115796 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-logs\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115830 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svn4b\" (UniqueName: \"kubernetes.io/projected/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-kube-api-access-svn4b\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115916 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115947 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.115974 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.116867 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-logs\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.116912 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 
crc kubenswrapper[4782]: I0130 18:50:44.117077 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.125658 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.126875 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.128340 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.129351 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.137058 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svn4b\" (UniqueName: \"kubernetes.io/projected/fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397-kube-api-access-svn4b\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.158143 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397\") " pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.220297 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.433078 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82abe4a-d9ad-47dd-bd5c-2704052ba388" path="/var/lib/kubelet/pods/e82abe4a-d9ad-47dd-bd5c-2704052ba388/volumes" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.433841 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11ce800-5d86-4411-922e-af28bc822732" path="/var/lib/kubelet/pods/f11ce800-5d86-4411-922e-af28bc822732/volumes" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.561435 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.656491 4782 generic.go:334] "Generic (PLEG): container finished" podID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerID="6657418c9a7b7eccd97f606c95485ebde2d2ee5d06489b30bf910d1a6569a0a9" exitCode=0 Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.656557 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3bb19d14-c078-4fee-81ab-6246f1f56059","Type":"ContainerDied","Data":"6657418c9a7b7eccd97f606c95485ebde2d2ee5d06489b30bf910d1a6569a0a9"} Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.670525 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"25e52062-f76c-4ebf-9738-8e5a9990aba9","Type":"ContainerStarted","Data":"ecf62fd675252b2fe3f56434b2da765e6b7fe5e28b5e99a37cb2b4c9ea5f7981"} Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.701461 4782 generic.go:334] "Generic (PLEG): container finished" podID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerID="670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155" exitCode=0 Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.701494 4782 generic.go:334] "Generic (PLEG): container finished" podID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerID="fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572" exitCode=2 Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.702255 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerDied","Data":"670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155"} Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.702283 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerDied","Data":"fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572"} Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.785542 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.894680 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951245 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-httpd-run\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951299 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr8jm\" (UniqueName: \"kubernetes.io/projected/3bb19d14-c078-4fee-81ab-6246f1f56059-kube-api-access-lr8jm\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951327 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-combined-ca-bundle\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951388 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951542 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-logs\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951570 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-scripts\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951607 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-internal-tls-certs\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.951621 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-config-data\") pod \"3bb19d14-c078-4fee-81ab-6246f1f56059\" (UID: \"3bb19d14-c078-4fee-81ab-6246f1f56059\") " Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.952267 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-logs" (OuterVolumeSpecName: "logs") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.952602 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.982246 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.982407 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb19d14-c078-4fee-81ab-6246f1f56059-kube-api-access-lr8jm" (OuterVolumeSpecName: "kube-api-access-lr8jm") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "kube-api-access-lr8jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:44 crc kubenswrapper[4782]: I0130 18:50:44.985366 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-scripts" (OuterVolumeSpecName: "scripts") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.010965 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.014701 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-config-data" (OuterVolumeSpecName: "config-data") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.023791 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bb19d14-c078-4fee-81ab-6246f1f56059" (UID: "3bb19d14-c078-4fee-81ab-6246f1f56059"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053817 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053852 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053864 4782 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053876 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053885 4782 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3bb19d14-c078-4fee-81ab-6246f1f56059-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053893 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr8jm\" (UniqueName: \"kubernetes.io/projected/3bb19d14-c078-4fee-81ab-6246f1f56059-kube-api-access-lr8jm\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053903 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bb19d14-c078-4fee-81ab-6246f1f56059-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.053935 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.071895 4782 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.155173 4782 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.712942 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3bb19d14-c078-4fee-81ab-6246f1f56059","Type":"ContainerDied","Data":"78354cbfeccc9b5b4c43cf77cf16ebca46fa797416af14add9382d477d674563"} Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.713360 4782 scope.go:117] "RemoveContainer" containerID="6657418c9a7b7eccd97f606c95485ebde2d2ee5d06489b30bf910d1a6569a0a9" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.712984 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.714975 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"25e52062-f76c-4ebf-9738-8e5a9990aba9","Type":"ContainerStarted","Data":"5964defdfa45d17fdfc278443fd56e5ab0fd5d0f2905d46a050c910597452f5d"} Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.718896 4782 generic.go:334] "Generic (PLEG): container finished" podID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerID="935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261" exitCode=0 Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.718947 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerDied","Data":"935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261"} Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.722112 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397","Type":"ContainerStarted","Data":"c81cf0db46ad2e335b8f24adf799c5a37232726bf90c5d0485e57603d7851eb9"} Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.722148 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397","Type":"ContainerStarted","Data":"9717aada99573f7226f8979e326ae0906627f845b34fb9cc93b9aa82be000dd8"} Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.734448 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.73442588 podStartE2EDuration="2.73442588s" podCreationTimestamp="2026-01-30 18:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:45.734330448 +0000 UTC m=+1222.002708473" watchObservedRunningTime="2026-01-30 18:50:45.73442588 +0000 UTC m=+1222.002803915" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.750669 4782 scope.go:117] "RemoveContainer" containerID="cc17318eb81e0fab13522608b3af76a8b5e1c1b435c697df5058572259139b1c" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.762522 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.776255 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.800487 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:45 crc kubenswrapper[4782]: E0130 18:50:45.800897 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerName="glance-log" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.800913 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerName="glance-log" Jan 30 18:50:45 crc kubenswrapper[4782]: E0130 18:50:45.800923 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerName="glance-httpd" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.800930 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" 
containerName="glance-httpd" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.801145 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerName="glance-log" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.801157 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" containerName="glance-httpd" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.802206 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.805362 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.805378 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.828006 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974175 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974593 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8tm5\" (UniqueName: \"kubernetes.io/projected/974e57d1-5346-4863-a1e3-1b595eaa91b5-kube-api-access-w8tm5\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974616 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/974e57d1-5346-4863-a1e3-1b595eaa91b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974666 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974694 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974735 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" 
Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974768 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:45 crc kubenswrapper[4782]: I0130 18:50:45.974820 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e57d1-5346-4863-a1e3-1b595eaa91b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077271 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8tm5\" (UniqueName: \"kubernetes.io/projected/974e57d1-5346-4863-a1e3-1b595eaa91b5-kube-api-access-w8tm5\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077443 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/974e57d1-5346-4863-a1e3-1b595eaa91b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077532 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077627 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077703 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077871 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077968 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e57d1-5346-4863-a1e3-1b595eaa91b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.078088 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.078568 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/974e57d1-5346-4863-a1e3-1b595eaa91b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.077969 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.078190 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/974e57d1-5346-4863-a1e3-1b595eaa91b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.082177 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.088300 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.088899 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.098672 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8tm5\" (UniqueName: \"kubernetes.io/projected/974e57d1-5346-4863-a1e3-1b595eaa91b5-kube-api-access-w8tm5\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.103939 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/974e57d1-5346-4863-a1e3-1b595eaa91b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.129627 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"974e57d1-5346-4863-a1e3-1b595eaa91b5\") " pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.433884 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.434646 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb19d14-c078-4fee-81ab-6246f1f56059" path="/var/lib/kubelet/pods/3bb19d14-c078-4fee-81ab-6246f1f56059/volumes" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.537867 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693125 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-run-httpd\") pod \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693206 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr97x\" (UniqueName: \"kubernetes.io/projected/6d8b737e-7a4f-4831-810b-2f8eadc6712d-kube-api-access-hr97x\") pod \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693278 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-config-data\") pod \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693304 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-sg-core-conf-yaml\") pod \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693330 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-scripts\") pod \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693390 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-combined-ca-bundle\") pod \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693424 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-log-httpd\") pod \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.693448 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-ceilometer-tls-certs\") pod 
\"6d8b737e-7a4f-4831-810b-2f8eadc6712d\" (UID: \"6d8b737e-7a4f-4831-810b-2f8eadc6712d\") " Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.696300 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.696531 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.703403 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-scripts" (OuterVolumeSpecName: "scripts") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.713796 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8b737e-7a4f-4831-810b-2f8eadc6712d-kube-api-access-hr97x" (OuterVolumeSpecName: "kube-api-access-hr97x") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "kube-api-access-hr97x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.735821 4782 generic.go:334] "Generic (PLEG): container finished" podID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerID="c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52" exitCode=0 Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.735900 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerDied","Data":"c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52"} Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.735952 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8b737e-7a4f-4831-810b-2f8eadc6712d","Type":"ContainerDied","Data":"600343d072950175942fc27ef02b8084f5c3ab894855cd5c4397a5bb124c6fc3"} Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.735973 4782 scope.go:117] "RemoveContainer" containerID="670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.736222 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.749377 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397","Type":"ContainerStarted","Data":"d47714ea6b0cf8dc574373b36671bffa24276608c37f56c4db678d8a0f1955ed"} Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.757222 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.770335 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.772853 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.772834998 podStartE2EDuration="3.772834998s" podCreationTimestamp="2026-01-30 18:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:46.76768373 +0000 UTC m=+1223.036061755" watchObservedRunningTime="2026-01-30 18:50:46.772834998 +0000 UTC m=+1223.041213023" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.782504 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.798129 4782 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.798162 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.798172 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.798181 4782 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.798191 4782 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.798199 4782 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8b737e-7a4f-4831-810b-2f8eadc6712d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.798210 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr97x\" (UniqueName: \"kubernetes.io/projected/6d8b737e-7a4f-4831-810b-2f8eadc6712d-kube-api-access-hr97x\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.830484 4782 scope.go:117] "RemoveContainer" containerID="fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.858027 4782 scope.go:117] "RemoveContainer" containerID="935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.867271 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-config-data" (OuterVolumeSpecName: "config-data") pod "6d8b737e-7a4f-4831-810b-2f8eadc6712d" (UID: "6d8b737e-7a4f-4831-810b-2f8eadc6712d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.880825 4782 scope.go:117] "RemoveContainer" containerID="c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.900325 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8b737e-7a4f-4831-810b-2f8eadc6712d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.906110 4782 scope.go:117] "RemoveContainer" containerID="670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155" Jan 30 18:50:46 crc kubenswrapper[4782]: E0130 18:50:46.911498 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155\": container with ID starting with 670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155 not found: ID does not exist" containerID="670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.911563 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155"} err="failed to get container status \"670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155\": rpc error: code = NotFound desc = could not find container \"670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155\": container with ID starting with 670e9bbda1fcf5b59f4ff892abead21446c1850cc476c599209da52885952155 not found: ID does not exist" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.911598 4782 scope.go:117] "RemoveContainer" containerID="fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572" Jan 30 18:50:46 crc kubenswrapper[4782]: E0130 18:50:46.913854 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572\": container with ID starting with fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572 not found: ID does not exist" containerID="fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.913890 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572"} err="failed to get container status \"fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572\": rpc error: code = NotFound desc = could not find container \"fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572\": container with ID starting with fc65d438df94892d121b3b0482138ff3e1cb804e943640b155661ebb1ecf3572 not found: ID does not exist" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.913914 4782 scope.go:117] "RemoveContainer" containerID="935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261" Jan 30 18:50:46 crc kubenswrapper[4782]: E0130 18:50:46.914484 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261\": container with ID starting with 935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261 not found: ID does not exist" 
containerID="935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.914508 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261"} err="failed to get container status \"935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261\": rpc error: code = NotFound desc = could not find container \"935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261\": container with ID starting with 935cbe8b85a28ee54148c5e4b691ef3d6afee77ee33af8286b21a6097b805261 not found: ID does not exist" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.914528 4782 scope.go:117] "RemoveContainer" containerID="c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52" Jan 30 18:50:46 crc kubenswrapper[4782]: E0130 18:50:46.914791 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52\": container with ID starting with c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52 not found: ID does not exist" containerID="c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52" Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.914815 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52"} err="failed to get container status \"c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52\": rpc error: code = NotFound desc = could not find container \"c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52\": container with ID starting with c5bafd167b9176ff0a3991b34d74a502d93d130b60f3e5dd263caee4916c5f52 not found: ID does not exist" Jan 30 18:50:46 crc kubenswrapper[4782]: W0130 18:50:46.968810 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod974e57d1_5346_4863_a1e3_1b595eaa91b5.slice/crio-3754fbcc785b494e7077fa9e4d24e827eb6a7333635673750f8652c80c4ff2c4 WatchSource:0}: Error finding container 3754fbcc785b494e7077fa9e4d24e827eb6a7333635673750f8652c80c4ff2c4: Status 404 returned error can't find the container with id 3754fbcc785b494e7077fa9e4d24e827eb6a7333635673750f8652c80c4ff2c4 Jan 30 18:50:46 crc kubenswrapper[4782]: I0130 18:50:46.969794 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.084647 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.103248 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.115559 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:47 crc kubenswrapper[4782]: E0130 18:50:47.116114 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="sg-core" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116132 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="sg-core" Jan 30 18:50:47 crc kubenswrapper[4782]: E0130 18:50:47.116150 4782 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="proxy-httpd" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116158 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="proxy-httpd" Jan 30 18:50:47 crc kubenswrapper[4782]: E0130 18:50:47.116173 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-notification-agent" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116181 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-notification-agent" Jan 30 18:50:47 crc kubenswrapper[4782]: E0130 18:50:47.116197 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-central-agent" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116204 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-central-agent" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116506 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-notification-agent" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116524 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="ceilometer-central-agent" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116539 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="proxy-httpd" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.116551 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" containerName="sg-core" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.118751 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.121569 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.121784 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.123976 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.129951 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.307722 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.307964 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-run-httpd\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.307985 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.308008 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-log-httpd\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.308037 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-scripts\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.308206 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-config-data\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.308280 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.308367 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srzh\" (UniqueName: 
\"kubernetes.io/projected/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-kube-api-access-7srzh\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410434 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410481 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-run-httpd\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410503 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410528 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-log-httpd\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410562 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-scripts\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410608 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-config-data\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410627 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.410692 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srzh\" (UniqueName: \"kubernetes.io/projected/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-kube-api-access-7srzh\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.411360 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-log-httpd\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.411412 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-run-httpd\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.415507 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.415992 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-config-data\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.416007 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.416049 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.426378 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-scripts\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.428624 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srzh\" (UniqueName: \"kubernetes.io/projected/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-kube-api-access-7srzh\") pod \"ceilometer-0\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.442711 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.757220 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"974e57d1-5346-4863-a1e3-1b595eaa91b5","Type":"ContainerStarted","Data":"a08ef69943067e412263280a75c01888d96666a1c8028ae5dd66385f76f9c7f8"} Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.757486 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"974e57d1-5346-4863-a1e3-1b595eaa91b5","Type":"ContainerStarted","Data":"3754fbcc785b494e7077fa9e4d24e827eb6a7333635673750f8652c80c4ff2c4"} Jan 30 18:50:47 crc kubenswrapper[4782]: I0130 18:50:47.908130 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:50:48 crc kubenswrapper[4782]: I0130 18:50:48.420878 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8b737e-7a4f-4831-810b-2f8eadc6712d" path="/var/lib/kubelet/pods/6d8b737e-7a4f-4831-810b-2f8eadc6712d/volumes" Jan 30 18:50:48 crc kubenswrapper[4782]: I0130 18:50:48.772610 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"974e57d1-5346-4863-a1e3-1b595eaa91b5","Type":"ContainerStarted","Data":"76cce392efa56137274f3b009758140af36cffd334d4f3c0ed58fa6782daf7cb"} Jan 30 18:50:48 crc kubenswrapper[4782]: I0130 18:50:48.774918 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerStarted","Data":"02d817910bd0ccae7591b24967cbf19ec861501ed5de1f66e2d1187b6b807213"} Jan 30 18:50:48 crc kubenswrapper[4782]: I0130 18:50:48.774971 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerStarted","Data":"a8254dc2dddbfe8d32fd274d1b80c4b240b1ce1b1ee5b1f596546f0c1d60dff6"} Jan 30 18:50:48 crc kubenswrapper[4782]: I0130 18:50:48.774990 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerStarted","Data":"6b61b43dd6de415703a6cbb805d324b820341f48ebb7cda456e35e21aea8dbb9"} Jan 30 18:50:48 crc kubenswrapper[4782]: I0130 18:50:48.801961 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.801941632 podStartE2EDuration="3.801941632s" podCreationTimestamp="2026-01-30 18:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:50:48.795147184 +0000 UTC m=+1225.063525229" watchObservedRunningTime="2026-01-30 18:50:48.801941632 +0000 UTC m=+1225.070319667" Jan 30 18:50:49 crc kubenswrapper[4782]: I0130 18:50:49.787782 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerStarted","Data":"5566e832b97370022a9829904247b94731eaed56e61006c31608b447fad558c4"} Jan 30 18:50:49 crc kubenswrapper[4782]: I0130 18:50:49.792913 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:50:49 crc 
kubenswrapper[4782]: I0130 18:50:49.792962 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:50:51 crc kubenswrapper[4782]: I0130 18:50:51.410244 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b357c566-0063-4e60-b284-9d4e8911734d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.179:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:50:52 crc kubenswrapper[4782]: I0130 18:50:52.818341 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerStarted","Data":"50a3c3bbcfe980c163749d2c8bebbf8b4c0e5ee59e00519a95b63f917d00dabd"} Jan 30 18:50:52 crc kubenswrapper[4782]: I0130 18:50:52.818861 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 18:50:52 crc kubenswrapper[4782]: I0130 18:50:52.854849 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.217394826 podStartE2EDuration="5.854824471s" podCreationTimestamp="2026-01-30 18:50:47 +0000 UTC" firstStartedPulling="2026-01-30 18:50:47.923758418 +0000 UTC m=+1224.192136463" lastFinishedPulling="2026-01-30 18:50:51.561188083 +0000 UTC m=+1227.829566108" observedRunningTime="2026-01-30 18:50:52.848094245 +0000 UTC m=+1229.116472300" watchObservedRunningTime="2026-01-30 18:50:52.854824471 +0000 UTC m=+1229.123202516" Jan 30 18:50:53 crc kubenswrapper[4782]: I0130 18:50:53.986621 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.040815 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.221299 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.221352 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.265780 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.311367 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.843398 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.843690 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.843702 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 18:50:54 crc kubenswrapper[4782]: I0130 18:50:54.878927 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/watcher-decision-engine-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.435520 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.435763 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.464741 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.483141 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.656025 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.694628 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.861111 4782 generic.go:334] "Generic (PLEG): container finished" podID="616ca793-5768-474b-b80c-c29026c68bd6" containerID="7a45e10cc336b44091367d42fa0e6c048a33f1597edc1ca8f69759f18364baa5" exitCode=0 Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.861209 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" event={"ID":"616ca793-5768-474b-b80c-c29026c68bd6","Type":"ContainerDied","Data":"7a45e10cc336b44091367d42fa0e6c048a33f1597edc1ca8f69759f18364baa5"} Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.861901 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:56 crc kubenswrapper[4782]: I0130 18:50:56.861931 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.300900 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.442132 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-combined-ca-bundle\") pod \"616ca793-5768-474b-b80c-c29026c68bd6\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.442295 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq8ww\" (UniqueName: \"kubernetes.io/projected/616ca793-5768-474b-b80c-c29026c68bd6-kube-api-access-sq8ww\") pod \"616ca793-5768-474b-b80c-c29026c68bd6\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.442390 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-scripts\") pod \"616ca793-5768-474b-b80c-c29026c68bd6\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.442496 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-config-data\") pod \"616ca793-5768-474b-b80c-c29026c68bd6\" (UID: \"616ca793-5768-474b-b80c-c29026c68bd6\") " Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.447722 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616ca793-5768-474b-b80c-c29026c68bd6-kube-api-access-sq8ww" (OuterVolumeSpecName: "kube-api-access-sq8ww") pod "616ca793-5768-474b-b80c-c29026c68bd6" (UID: "616ca793-5768-474b-b80c-c29026c68bd6"). InnerVolumeSpecName "kube-api-access-sq8ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.449962 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-scripts" (OuterVolumeSpecName: "scripts") pod "616ca793-5768-474b-b80c-c29026c68bd6" (UID: "616ca793-5768-474b-b80c-c29026c68bd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.470276 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-config-data" (OuterVolumeSpecName: "config-data") pod "616ca793-5768-474b-b80c-c29026c68bd6" (UID: "616ca793-5768-474b-b80c-c29026c68bd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.486363 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "616ca793-5768-474b-b80c-c29026c68bd6" (UID: "616ca793-5768-474b-b80c-c29026c68bd6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.545007 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq8ww\" (UniqueName: \"kubernetes.io/projected/616ca793-5768-474b-b80c-c29026c68bd6-kube-api-access-sq8ww\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.545036 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.545045 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.545054 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616ca793-5768-474b-b80c-c29026c68bd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.617050 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.683261 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.884466 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.884651 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wsq8m" event={"ID":"616ca793-5768-474b-b80c-c29026c68bd6","Type":"ContainerDied","Data":"a40ab029cac7d2a1ae12d1ba3affdde28e68424078e63963d88cc9e35efd0747"} Jan 30 18:50:58 crc kubenswrapper[4782]: I0130 18:50:58.884958 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a40ab029cac7d2a1ae12d1ba3affdde28e68424078e63963d88cc9e35efd0747" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.027922 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:50:59 crc kubenswrapper[4782]: E0130 18:50:59.028410 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616ca793-5768-474b-b80c-c29026c68bd6" containerName="nova-cell0-conductor-db-sync" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.028431 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="616ca793-5768-474b-b80c-c29026c68bd6" containerName="nova-cell0-conductor-db-sync" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.028676 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="616ca793-5768-474b-b80c-c29026c68bd6" containerName="nova-cell0-conductor-db-sync" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.029415 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.031663 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.042767 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jdpv6" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.047091 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.175825 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dwv\" (UniqueName: \"kubernetes.io/projected/c2ea0c45-ec0b-4194-94f8-34458c85b84d-kube-api-access-t5dwv\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.175946 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.176062 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.277782 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.277903 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5dwv\" (UniqueName: \"kubernetes.io/projected/c2ea0c45-ec0b-4194-94f8-34458c85b84d-kube-api-access-t5dwv\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.278343 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.282919 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.286548 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.297718 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5dwv\" (UniqueName: \"kubernetes.io/projected/c2ea0c45-ec0b-4194-94f8-34458c85b84d-kube-api-access-t5dwv\") pod \"nova-cell0-conductor-0\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.357952 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 18:50:59 crc kubenswrapper[4782]: I0130 18:50:59.918714 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:51:00 crc kubenswrapper[4782]: I0130 18:51:00.906441 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c2ea0c45-ec0b-4194-94f8-34458c85b84d","Type":"ContainerStarted","Data":"cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df"} Jan 30 18:51:00 crc kubenswrapper[4782]: I0130 18:51:00.906768 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c2ea0c45-ec0b-4194-94f8-34458c85b84d","Type":"ContainerStarted","Data":"3959384108ee5a1d01b56f6cdd17944e92948b655227e7a53235405665f31abb"} Jan 30 18:51:00 crc kubenswrapper[4782]: I0130 18:51:00.906917 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:00 crc kubenswrapper[4782]: I0130 18:51:00.928402 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.9283846599999999 podStartE2EDuration="1.92838466s" podCreationTimestamp="2026-01-30 18:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:00.921854358 +0000 UTC m=+1237.190232383" watchObservedRunningTime="2026-01-30 18:51:00.92838466 +0000 UTC m=+1237.196762685" Jan 30 18:51:01 crc kubenswrapper[4782]: I0130 18:51:01.834394 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:51:02 crc kubenswrapper[4782]: I0130 18:51:02.949217 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c2ea0c45-ec0b-4194-94f8-34458c85b84d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df" gracePeriod=30 Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.655567 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.656531 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="sg-core" containerID="cri-o://5566e832b97370022a9829904247b94731eaed56e61006c31608b447fad558c4" gracePeriod=30 Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.656531 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-notification-agent" containerID="cri-o://02d817910bd0ccae7591b24967cbf19ec861501ed5de1f66e2d1187b6b807213" gracePeriod=30 Jan 30 
18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.656543 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="proxy-httpd" containerID="cri-o://50a3c3bbcfe980c163749d2c8bebbf8b4c0e5ee59e00519a95b63f917d00dabd" gracePeriod=30 Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.656706 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-central-agent" containerID="cri-o://a8254dc2dddbfe8d32fd274d1b80c4b240b1ce1b1ee5b1f596546f0c1d60dff6" gracePeriod=30 Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.682869 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.867705 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.965342 4782 generic.go:334] "Generic (PLEG): container finished" podID="c2ea0c45-ec0b-4194-94f8-34458c85b84d" containerID="cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df" exitCode=0 Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.965394 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c2ea0c45-ec0b-4194-94f8-34458c85b84d","Type":"ContainerDied","Data":"cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df"} Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.965421 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c2ea0c45-ec0b-4194-94f8-34458c85b84d","Type":"ContainerDied","Data":"3959384108ee5a1d01b56f6cdd17944e92948b655227e7a53235405665f31abb"} Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.965437 4782 scope.go:117] "RemoveContainer" containerID="cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df" Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.965535 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.969798 4782 generic.go:334] "Generic (PLEG): container finished" podID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerID="50a3c3bbcfe980c163749d2c8bebbf8b4c0e5ee59e00519a95b63f917d00dabd" exitCode=0 Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.969810 4782 generic.go:334] "Generic (PLEG): container finished" podID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerID="5566e832b97370022a9829904247b94731eaed56e61006c31608b447fad558c4" exitCode=2 Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.969822 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerDied","Data":"50a3c3bbcfe980c163749d2c8bebbf8b4c0e5ee59e00519a95b63f917d00dabd"} Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.969837 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerDied","Data":"5566e832b97370022a9829904247b94731eaed56e61006c31608b447fad558c4"} Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.976265 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5dwv\" (UniqueName: \"kubernetes.io/projected/c2ea0c45-ec0b-4194-94f8-34458c85b84d-kube-api-access-t5dwv\") pod \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.976363 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-config-data\") pod \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " Jan 30 18:51:03 crc kubenswrapper[4782]: I0130 18:51:03.976739 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-combined-ca-bundle\") pod \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\" (UID: \"c2ea0c45-ec0b-4194-94f8-34458c85b84d\") " Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:03.999985 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ea0c45-ec0b-4194-94f8-34458c85b84d-kube-api-access-t5dwv" (OuterVolumeSpecName: "kube-api-access-t5dwv") pod "c2ea0c45-ec0b-4194-94f8-34458c85b84d" (UID: "c2ea0c45-ec0b-4194-94f8-34458c85b84d"). InnerVolumeSpecName "kube-api-access-t5dwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.007957 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2ea0c45-ec0b-4194-94f8-34458c85b84d" (UID: "c2ea0c45-ec0b-4194-94f8-34458c85b84d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.033211 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-config-data" (OuterVolumeSpecName: "config-data") pod "c2ea0c45-ec0b-4194-94f8-34458c85b84d" (UID: "c2ea0c45-ec0b-4194-94f8-34458c85b84d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.074119 4782 scope.go:117] "RemoveContainer" containerID="cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df" Jan 30 18:51:04 crc kubenswrapper[4782]: E0130 18:51:04.074657 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df\": container with ID starting with cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df not found: ID does not exist" containerID="cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.074800 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df"} err="failed to get container status \"cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df\": rpc error: code = NotFound desc = could not find container \"cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df\": container with ID starting with cb6616f23f30604659dcf526025b4959979ab89785f4ba63dd6c43e3d92be6df not found: ID does not exist" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.079281 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5dwv\" (UniqueName: \"kubernetes.io/projected/c2ea0c45-ec0b-4194-94f8-34458c85b84d-kube-api-access-t5dwv\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.079397 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.079475 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ea0c45-ec0b-4194-94f8-34458c85b84d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.301880 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.329037 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.350191 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:51:04 crc kubenswrapper[4782]: E0130 18:51:04.350910 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ea0c45-ec0b-4194-94f8-34458c85b84d" containerName="nova-cell0-conductor-conductor" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.350947 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ea0c45-ec0b-4194-94f8-34458c85b84d" containerName="nova-cell0-conductor-conductor" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.351309 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ea0c45-ec0b-4194-94f8-34458c85b84d" containerName="nova-cell0-conductor-conductor" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.352404 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.354517 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.354826 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jdpv6" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.362679 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.424098 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ea0c45-ec0b-4194-94f8-34458c85b84d" path="/var/lib/kubelet/pods/c2ea0c45-ec0b-4194-94f8-34458c85b84d/volumes" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.491011 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bpr\" (UniqueName: \"kubernetes.io/projected/860e5849-ad0b-4f89-87db-b839441f0dd9-kube-api-access-q8bpr\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.491531 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860e5849-ad0b-4f89-87db-b839441f0dd9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.491612 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e5849-ad0b-4f89-87db-b839441f0dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.593715 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860e5849-ad0b-4f89-87db-b839441f0dd9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.593794 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e5849-ad0b-4f89-87db-b839441f0dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.593902 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8bpr\" (UniqueName: \"kubernetes.io/projected/860e5849-ad0b-4f89-87db-b839441f0dd9-kube-api-access-q8bpr\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.599100 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/860e5849-ad0b-4f89-87db-b839441f0dd9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc 
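Between 18:51:01 and 18:51:04 the conductor pod is deleted and immediately recreated under the same name: SyncLoop DELETE arrives from the API, the kubelet kills the container with a 30s grace period, unmounts and detaches the secret and projected volumes, then processes SyncLoop REMOVE followed by SyncLoop ADD for a replacement pod whose UID changes from c2ea0c45-... to 860e5849-.... The client-go sketch below (the kubeconfig path is an assumption, not taken from this log) watches the pod and prints every UID the name shows up with, which makes that turnover visible from the API side.

    // watch_conductor_uid.go - a client-go sketch; assumes a kubeconfig at
    // $HOME/.kube/config with read access to the openstack namespace.
    package main

    import (
        "context"
        "fmt"
        "os"
        "path/filepath"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", filepath.Join(os.Getenv("HOME"), ".kube", "config"))
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // Watch only the pod named nova-cell0-conductor-0 in the openstack namespace.
        w, err := cs.CoreV1().Pods("openstack").Watch(context.Background(), metav1.ListOptions{
            FieldSelector: "metadata.name=nova-cell0-conductor-0",
        })
        if err != nil {
            panic(err)
        }
        defer w.Stop()

        for ev := range w.ResultChan() {
            pod, ok := ev.Object.(*corev1.Pod)
            if !ok {
                continue // e.g. a watch error object
            }
            fmt.Printf("%-8s uid=%s phase=%s\n", ev.Type, pod.UID, pod.Status.Phase)
        }
    }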
kubenswrapper[4782]: I0130 18:51:04.600006 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/860e5849-ad0b-4f89-87db-b839441f0dd9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.625442 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8bpr\" (UniqueName: \"kubernetes.io/projected/860e5849-ad0b-4f89-87db-b839441f0dd9-kube-api-access-q8bpr\") pod \"nova-cell0-conductor-0\" (UID: \"860e5849-ad0b-4f89-87db-b839441f0dd9\") " pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.668846 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.982973 4782 generic.go:334] "Generic (PLEG): container finished" podID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerID="a8254dc2dddbfe8d32fd274d1b80c4b240b1ce1b1ee5b1f596546f0c1d60dff6" exitCode=0 Jan 30 18:51:04 crc kubenswrapper[4782]: I0130 18:51:04.983205 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerDied","Data":"a8254dc2dddbfe8d32fd274d1b80c4b240b1ce1b1ee5b1f596546f0c1d60dff6"} Jan 30 18:51:05 crc kubenswrapper[4782]: I0130 18:51:05.181734 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 18:51:06 crc kubenswrapper[4782]: I0130 18:51:06.011977 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"860e5849-ad0b-4f89-87db-b839441f0dd9","Type":"ContainerStarted","Data":"db207dae90522a02e6bd664a102a4bce22f9ec65a3240edb686e920162985f7b"} Jan 30 18:51:06 crc kubenswrapper[4782]: I0130 18:51:06.012380 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"860e5849-ad0b-4f89-87db-b839441f0dd9","Type":"ContainerStarted","Data":"60342f84c0dab61de44f891ec24ef668d1358be9c9484595aa074312f42c0f14"} Jan 30 18:51:06 crc kubenswrapper[4782]: I0130 18:51:06.012414 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:06 crc kubenswrapper[4782]: I0130 18:51:06.043962 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.043938707 podStartE2EDuration="2.043938707s" podCreationTimestamp="2026-01-30 18:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:06.033167351 +0000 UTC m=+1242.301545406" watchObservedRunningTime="2026-01-30 18:51:06.043938707 +0000 UTC m=+1242.312316772" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.048284 4782 generic.go:334] "Generic (PLEG): container finished" podID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerID="02d817910bd0ccae7591b24967cbf19ec861501ed5de1f66e2d1187b6b807213" exitCode=0 Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.048367 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerDied","Data":"02d817910bd0ccae7591b24967cbf19ec861501ed5de1f66e2d1187b6b807213"} Jan 30 18:51:09 crc 
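The ceilometer containers killed at 18:51:03 all exit well inside their 30-second grace period: proxy-httpd and sg-core within the same second (sg-core with exit code 2, the others with 0), ceilometer-central-agent about a second later, and ceilometer-notification-agent by 18:51:09. The trivial sketch below, with the two timestamps copied from the entries above, shows the longest of those gaps, roughly 5.4s.

    // grace_delta.go - a minimal sketch, not kubelet code; the timestamps come
    // from the "Killing container with a grace period" and "container finished"
    // entries for ceilometer-notification-agent in this journal excerpt.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "15:04:05.000000"
        killed, _ := time.Parse(layout, "18:51:03.656531")   // kill issued, gracePeriod=30
        finished, _ := time.Parse(layout, "18:51:09.048284") // PLEG reports the container finished
        fmt.Printf("exited %.1fs after the kill, within the 30s grace period\n", finished.Sub(killed).Seconds())
    }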
kubenswrapper[4782]: I0130 18:51:09.316945 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391165 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-sg-core-conf-yaml\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391305 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-run-httpd\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391342 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-config-data\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391389 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-log-httpd\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391444 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-combined-ca-bundle\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391582 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7srzh\" (UniqueName: \"kubernetes.io/projected/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-kube-api-access-7srzh\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391720 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-scripts\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.391751 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-ceilometer-tls-certs\") pod \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\" (UID: \"9d19aa19-3ca4-4145-b76f-c932ad59fdfe\") " Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.392088 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.392107 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.392381 4782 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.392401 4782 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.402126 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-scripts" (OuterVolumeSpecName: "scripts") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.402223 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-kube-api-access-7srzh" (OuterVolumeSpecName: "kube-api-access-7srzh") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "kube-api-access-7srzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.455569 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.468652 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.484465 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.494437 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.495219 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7srzh\" (UniqueName: \"kubernetes.io/projected/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-kube-api-access-7srzh\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.495479 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.495682 4782 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.495856 4782 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.536912 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-config-data" (OuterVolumeSpecName: "config-data") pod "9d19aa19-3ca4-4145-b76f-c932ad59fdfe" (UID: "9d19aa19-3ca4-4145-b76f-c932ad59fdfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:09 crc kubenswrapper[4782]: I0130 18:51:09.598387 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d19aa19-3ca4-4145-b76f-c932ad59fdfe-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.066907 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9d19aa19-3ca4-4145-b76f-c932ad59fdfe","Type":"ContainerDied","Data":"6b61b43dd6de415703a6cbb805d324b820341f48ebb7cda456e35e21aea8dbb9"} Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.066983 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.067407 4782 scope.go:117] "RemoveContainer" containerID="50a3c3bbcfe980c163749d2c8bebbf8b4c0e5ee59e00519a95b63f917d00dabd" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.109722 4782 scope.go:117] "RemoveContainer" containerID="5566e832b97370022a9829904247b94731eaed56e61006c31608b447fad558c4" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.129966 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.140985 4782 scope.go:117] "RemoveContainer" containerID="02d817910bd0ccae7591b24967cbf19ec861501ed5de1f66e2d1187b6b807213" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.145676 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.160481 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:51:10 crc kubenswrapper[4782]: E0130 18:51:10.165304 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-central-agent" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165441 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-central-agent" Jan 30 18:51:10 crc kubenswrapper[4782]: E0130 18:51:10.165459 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="sg-core" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165466 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="sg-core" Jan 30 18:51:10 crc kubenswrapper[4782]: E0130 18:51:10.165504 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-notification-agent" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165512 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-notification-agent" Jan 30 18:51:10 crc kubenswrapper[4782]: E0130 18:51:10.165524 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="proxy-httpd" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165529 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="proxy-httpd" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165700 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-notification-agent" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165718 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="proxy-httpd" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165730 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="sg-core" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.165748 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" containerName="ceilometer-central-agent" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.168346 4782 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.178676 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.179032 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.184765 4782 scope.go:117] "RemoveContainer" containerID="a8254dc2dddbfe8d32fd274d1b80c4b240b1ce1b1ee5b1f596546f0c1d60dff6" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.186108 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.189835 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215075 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215155 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-config-data\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215185 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn6mq\" (UniqueName: \"kubernetes.io/projected/4a2874ba-472b-468f-9aa6-3a48320e2e0c-kube-api-access-mn6mq\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215277 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215339 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-run-httpd\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215476 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-log-httpd\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215532 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 
18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.215562 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-scripts\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317319 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317378 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-scripts\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317438 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317489 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-config-data\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317576 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn6mq\" (UniqueName: \"kubernetes.io/projected/4a2874ba-472b-468f-9aa6-3a48320e2e0c-kube-api-access-mn6mq\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317643 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317752 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-run-httpd\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.317789 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-log-httpd\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.318900 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-run-httpd\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc 
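ceilometer-0 comes back with UID 4a2874ba-472b-468f-9aa6-3a48320e2e0c, and the kubelet mounts the same set of volumes it had just torn down: scripts, config-data, sg-core-conf-yaml, ceilometer-tls-certs, combined-ca-bundle, the run-httpd and log-httpd emptyDirs, and a fresh projected kube-api-access token. The sketch below (kubeconfig path assumed, as in the earlier sketch) reads the pod from the API and prints each declared volume with its source type, which can be cross-checked against the MountVolume entries above.

    // ceilometer_volumes.go - a client-go sketch listing the volumes declared
    // on openstack/ceilometer-0; the kubeconfig location is an assumption.
    package main

    import (
        "context"
        "fmt"
        "os"
        "path/filepath"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", filepath.Join(os.Getenv("HOME"), ".kube", "config"))
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        pod, err := cs.CoreV1().Pods("openstack").Get(context.Background(), "ceilometer-0", metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        fmt.Println("UID:", pod.UID)
        for _, v := range pod.Spec.Volumes {
            switch {
            case v.Secret != nil:
                fmt.Printf("  %-22s secret/%s\n", v.Name, v.Secret.SecretName)
            case v.Projected != nil:
                fmt.Printf("  %-22s projected\n", v.Name)
            case v.EmptyDir != nil:
                fmt.Printf("  %-22s emptyDir\n", v.Name)
            default:
                fmt.Printf("  %-22s (other source)\n", v.Name)
            }
        }
    }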
kubenswrapper[4782]: I0130 18:51:10.319618 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-log-httpd\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.323066 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.323859 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.324490 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.326054 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-config-data\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.335051 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-scripts\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.340124 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn6mq\" (UniqueName: \"kubernetes.io/projected/4a2874ba-472b-468f-9aa6-3a48320e2e0c-kube-api-access-mn6mq\") pod \"ceilometer-0\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " pod="openstack/ceilometer-0" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.439186 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d19aa19-3ca4-4145-b76f-c932ad59fdfe" path="/var/lib/kubelet/pods/9d19aa19-3ca4-4145-b76f-c932ad59fdfe/volumes" Jan 30 18:51:10 crc kubenswrapper[4782]: I0130 18:51:10.518866 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:51:11 crc kubenswrapper[4782]: I0130 18:51:11.044397 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:51:11 crc kubenswrapper[4782]: I0130 18:51:11.085100 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerStarted","Data":"33fe1610cc458d96ea24ea204aa6c8a486d7d69121cd5c70930b0af5d112b3da"} Jan 30 18:51:12 crc kubenswrapper[4782]: I0130 18:51:12.098888 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerStarted","Data":"2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12"} Jan 30 18:51:12 crc kubenswrapper[4782]: I0130 18:51:12.099541 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerStarted","Data":"8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a"} Jan 30 18:51:13 crc kubenswrapper[4782]: I0130 18:51:13.112298 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerStarted","Data":"83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b"} Jan 30 18:51:14 crc kubenswrapper[4782]: I0130 18:51:14.708500 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.146801 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerStarted","Data":"fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac"} Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.147046 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.181989 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.933873406 podStartE2EDuration="5.181963852s" podCreationTimestamp="2026-01-30 18:51:10 +0000 UTC" firstStartedPulling="2026-01-30 18:51:11.041843586 +0000 UTC m=+1247.310221611" lastFinishedPulling="2026-01-30 18:51:14.289934022 +0000 UTC m=+1250.558312057" observedRunningTime="2026-01-30 18:51:15.173955364 +0000 UTC m=+1251.442333399" watchObservedRunningTime="2026-01-30 18:51:15.181963852 +0000 UTC m=+1251.450341897" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.291331 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xjjp9"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.292770 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.296761 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.298945 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.305936 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xjjp9"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.337166 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r4lq\" (UniqueName: \"kubernetes.io/projected/485cb541-e79c-474e-b68d-34e10ee57480-kube-api-access-6r4lq\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.337262 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-scripts\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.337299 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.337722 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-config-data\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.439177 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.440039 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-config-data\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.440152 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r4lq\" (UniqueName: \"kubernetes.io/projected/485cb541-e79c-474e-b68d-34e10ee57480-kube-api-access-6r4lq\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.440200 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-scripts\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc 
kubenswrapper[4782]: I0130 18:51:15.440250 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.441920 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.446647 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-config-data\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.448283 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.457560 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.471495 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-scripts\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.479717 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r4lq\" (UniqueName: \"kubernetes.io/projected/485cb541-e79c-474e-b68d-34e10ee57480-kube-api-access-6r4lq\") pod \"nova-cell0-cell-mapping-xjjp9\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.479782 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.540507 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.541746 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8nmt\" (UniqueName: \"kubernetes.io/projected/9f4cd9dd-25d2-499d-9361-bfecfdc49547-kube-api-access-g8nmt\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.541892 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-config-data\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.541975 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.542242 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.545404 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.556687 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.586011 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.590313 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.604403 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.615070 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.637702 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644342 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8nmt\" (UniqueName: \"kubernetes.io/projected/9f4cd9dd-25d2-499d-9361-bfecfdc49547-kube-api-access-g8nmt\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644671 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-config-data\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644696 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-config-data\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644718 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-config-data\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644741 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644768 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644790 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khr5n\" (UniqueName: \"kubernetes.io/projected/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-kube-api-access-khr5n\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644818 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-logs\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644833 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644870 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a9720e-3e90-4f30-b251-5f63cb746fe3-logs\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.644886 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmnlf\" (UniqueName: \"kubernetes.io/projected/99a9720e-3e90-4f30-b251-5f63cb746fe3-kube-api-access-cmnlf\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.655427 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.656689 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.664083 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.668429 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-config-data\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.668493 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db5dc4879-sfqkw"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.670173 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.676016 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8nmt\" (UniqueName: \"kubernetes.io/projected/9f4cd9dd-25d2-499d-9361-bfecfdc49547-kube-api-access-g8nmt\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.703288 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.706574 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747025 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khr5n\" (UniqueName: \"kubernetes.io/projected/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-kube-api-access-khr5n\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747092 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747142 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-logs\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747169 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747200 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747334 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a9720e-3e90-4f30-b251-5f63cb746fe3-logs\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747358 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmnlf\" (UniqueName: \"kubernetes.io/projected/99a9720e-3e90-4f30-b251-5f63cb746fe3-kube-api-access-cmnlf\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc 
kubenswrapper[4782]: I0130 18:51:15.747399 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-svc\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747436 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747464 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-sb\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747512 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77k2\" (UniqueName: \"kubernetes.io/projected/30715f55-899e-47c8-a6f2-284ce89e38fa-kube-api-access-g77k2\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747536 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-nb\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747577 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-config\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747610 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhrgv\" (UniqueName: \"kubernetes.io/projected/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-kube-api-access-dhrgv\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747675 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-config-data\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.747707 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-config-data\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc 
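At 18:51:15 the API hands the kubelet a burst of new pods in quick succession: the nova-cell0-cell-mapping-xjjp9 job, nova-scheduler-0, nova-api-0, nova-metadata-0, nova-cell1-novncproxy-0 and dnsmasq-dns-db5dc4879-sfqkw, each followed by its own wave of VerifyControllerAttachedVolume and MountVolume entries. The sketch below (again assuming the journal has been saved to a hypothetical kubelet.log) tallies SyncLoop ADD/UPDATE/DELETE/REMOVE entries per pod, which is a quick way to summarize such a burst.

    // syncloop_tally.go - a minimal sketch (not kubelet code) that counts
    // SyncLoop event types per pod in a saved journal excerpt.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    func main() {
        f, err := os.Open("kubelet.log") // hypothetical saved copy of the journal above
        if err != nil {
            panic(err)
        }
        defer f.Close()

        // Matches e.g.: "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
        re := regexp.MustCompile(`"SyncLoop (ADD|UPDATE|DELETE|REMOVE)" source="api" pods=\["([^"]+)"\]`)
        counts := map[string]map[string]int{}

        sc := bufio.NewScanner(f)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
        for sc.Scan() {
            if m := re.FindStringSubmatch(sc.Text()); m != nil {
                pod, kind := m[2], m[1]
                if counts[pod] == nil {
                    counts[pod] = map[string]int{}
                }
                counts[pod][kind]++
            }
        }
        for pod, byType := range counts {
            fmt.Println(pod, byType)
        }
    }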
kubenswrapper[4782]: I0130 18:51:15.747748 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.748026 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-logs\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.748955 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a9720e-3e90-4f30-b251-5f63cb746fe3-logs\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.751737 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-config-data\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.752366 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.755121 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-config-data\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.758307 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5dc4879-sfqkw"] Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.772549 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.779862 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khr5n\" (UniqueName: \"kubernetes.io/projected/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-kube-api-access-khr5n\") pod \"nova-metadata-0\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " pod="openstack/nova-metadata-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.782787 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmnlf\" (UniqueName: \"kubernetes.io/projected/99a9720e-3e90-4f30-b251-5f63cb746fe3-kube-api-access-cmnlf\") pod \"nova-api-0\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850254 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850296 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-sb\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850335 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g77k2\" (UniqueName: \"kubernetes.io/projected/30715f55-899e-47c8-a6f2-284ce89e38fa-kube-api-access-g77k2\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850350 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-nb\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850379 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-config\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850401 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhrgv\" (UniqueName: \"kubernetes.io/projected/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-kube-api-access-dhrgv\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850476 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850508 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.850546 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-svc\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.852507 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0\") pod 
\"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.852516 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-config\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.853004 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-sb\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.853933 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-nb\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.854084 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-svc\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.854477 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.859342 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.866971 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.875186 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.883285 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77k2\" (UniqueName: \"kubernetes.io/projected/30715f55-899e-47c8-a6f2-284ce89e38fa-kube-api-access-g77k2\") pod \"dnsmasq-dns-db5dc4879-sfqkw\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.888369 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhrgv\" (UniqueName: \"kubernetes.io/projected/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-kube-api-access-dhrgv\") pod \"nova-cell1-novncproxy-0\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:15 crc kubenswrapper[4782]: I0130 18:51:15.939723 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.163937 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.184315 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.271969 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xjjp9"] Jan 30 18:51:16 crc kubenswrapper[4782]: W0130 18:51:16.436039 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99a9720e_3e90_4f30_b251_5f63cb746fe3.slice/crio-05cb6f83785a91d1bad8b609d13bfd045b019f1eaf0be8e112c9b345a32f3293 WatchSource:0}: Error finding container 05cb6f83785a91d1bad8b609d13bfd045b019f1eaf0be8e112c9b345a32f3293: Status 404 returned error can't find the container with id 05cb6f83785a91d1bad8b609d13bfd045b019f1eaf0be8e112c9b345a32f3293 Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.493004 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.574896 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8ht9p"] Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.581598 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.598957 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.599468 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.599632 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.635877 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8ht9p"] Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.659501 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.710241 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-config-data\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.710291 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-scripts\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.710343 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlk6c\" (UniqueName: \"kubernetes.io/projected/7c429234-8185-4737-91f8-a65403cc83d2-kube-api-access-mlk6c\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.710447 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.812893 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.813018 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-config-data\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.813043 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-scripts\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.813105 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlk6c\" (UniqueName: \"kubernetes.io/projected/7c429234-8185-4737-91f8-a65403cc83d2-kube-api-access-mlk6c\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.819004 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5dc4879-sfqkw"] Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.819601 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-config-data\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.821209 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-scripts\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.821668 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.846824 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlk6c\" (UniqueName: \"kubernetes.io/projected/7c429234-8185-4737-91f8-a65403cc83d2-kube-api-access-mlk6c\") pod \"nova-cell1-conductor-db-sync-8ht9p\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.849543 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:16 crc kubenswrapper[4782]: I0130 18:51:16.943507 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.185454 4782 generic.go:334] "Generic (PLEG): container finished" podID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerID="b5cfd805d47e6b6abbb85057b587f098120a4605886cfc397f56738b83d13023" exitCode=0 Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.185504 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" event={"ID":"30715f55-899e-47c8-a6f2-284ce89e38fa","Type":"ContainerDied","Data":"b5cfd805d47e6b6abbb85057b587f098120a4605886cfc397f56738b83d13023"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.185541 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" event={"ID":"30715f55-899e-47c8-a6f2-284ce89e38fa","Type":"ContainerStarted","Data":"c7b6d9e8fd84ec60b526d2e239fe8ab74a0f267317a049747f9428c24c285d74"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.191432 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f4cd9dd-25d2-499d-9361-bfecfdc49547","Type":"ContainerStarted","Data":"553838a2e0d358ae176ff0281bd886ae0e2afd6e012586ca6fb0d4d083a01611"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.194073 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd","Type":"ContainerStarted","Data":"bc02419bab85961ecc0a354c09a5aba63a0c963b55c441ab8dacf3d93a7f2801"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.195639 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99a9720e-3e90-4f30-b251-5f63cb746fe3","Type":"ContainerStarted","Data":"05cb6f83785a91d1bad8b609d13bfd045b019f1eaf0be8e112c9b345a32f3293"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.223495 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xjjp9" event={"ID":"485cb541-e79c-474e-b68d-34e10ee57480","Type":"ContainerStarted","Data":"74e73487c938481085f5a683833434bb42ac064614962d8c5a80d9ea10221a76"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.223543 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xjjp9" event={"ID":"485cb541-e79c-474e-b68d-34e10ee57480","Type":"ContainerStarted","Data":"c0b1a6fa1c016ef8ff656e747432a7af40685d65d5c20f0789fa7f5f6881010c"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.225959 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aaa475e2-5c3b-4298-b5a4-e291f861dd0e","Type":"ContainerStarted","Data":"0ebbe6ec0319c408c3b5a25a2cf484a43b0b0b0888b576c7c96efdeedd7cad74"} Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.250473 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xjjp9" podStartSLOduration=2.250454532 podStartE2EDuration="2.250454532s" podCreationTimestamp="2026-01-30 18:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:17.248326329 +0000 UTC m=+1253.516704364" watchObservedRunningTime="2026-01-30 18:51:17.250454532 +0000 UTC m=+1253.518832557" Jan 30 18:51:17 crc kubenswrapper[4782]: I0130 18:51:17.569859 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8ht9p"] Jan 30 
18:51:17 crc kubenswrapper[4782]: W0130 18:51:17.588497 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c429234_8185_4737_91f8_a65403cc83d2.slice/crio-d5a8b11fec75f716fe3355b672af879c40e3277d0424f8f1610592ae14c1b9a9 WatchSource:0}: Error finding container d5a8b11fec75f716fe3355b672af879c40e3277d0424f8f1610592ae14c1b9a9: Status 404 returned error can't find the container with id d5a8b11fec75f716fe3355b672af879c40e3277d0424f8f1610592ae14c1b9a9 Jan 30 18:51:18 crc kubenswrapper[4782]: I0130 18:51:18.243132 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" event={"ID":"7c429234-8185-4737-91f8-a65403cc83d2","Type":"ContainerStarted","Data":"0e9cacaa7b0cdad45cfb0edefec7a15f4c4225bee109c997a5cd0aee4813d2c1"} Jan 30 18:51:18 crc kubenswrapper[4782]: I0130 18:51:18.243567 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" event={"ID":"7c429234-8185-4737-91f8-a65403cc83d2","Type":"ContainerStarted","Data":"d5a8b11fec75f716fe3355b672af879c40e3277d0424f8f1610592ae14c1b9a9"} Jan 30 18:51:18 crc kubenswrapper[4782]: I0130 18:51:18.245632 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" event={"ID":"30715f55-899e-47c8-a6f2-284ce89e38fa","Type":"ContainerStarted","Data":"3257eb30ca752fa948613865fd62ec290d05a9f28de6035143c2d1c5f3a1f875"} Jan 30 18:51:18 crc kubenswrapper[4782]: I0130 18:51:18.276983 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" podStartSLOduration=2.276961825 podStartE2EDuration="2.276961825s" podCreationTimestamp="2026-01-30 18:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:18.272435303 +0000 UTC m=+1254.540813328" watchObservedRunningTime="2026-01-30 18:51:18.276961825 +0000 UTC m=+1254.545339850" Jan 30 18:51:18 crc kubenswrapper[4782]: I0130 18:51:18.307349 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" podStartSLOduration=3.307327886 podStartE2EDuration="3.307327886s" podCreationTimestamp="2026-01-30 18:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:18.295480483 +0000 UTC m=+1254.563858508" watchObservedRunningTime="2026-01-30 18:51:18.307327886 +0000 UTC m=+1254.575705911" Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.190715 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.199200 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.254975 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.793294 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.793360 4782 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.793412 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.794096 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30752c64226ba6f7e596e12313e1d813b202b07c8fa5c5bad850072993bf2126"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:51:19 crc kubenswrapper[4782]: I0130 18:51:19.794180 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://30752c64226ba6f7e596e12313e1d813b202b07c8fa5c5bad850072993bf2126" gracePeriod=600 Jan 30 18:51:20 crc kubenswrapper[4782]: I0130 18:51:20.268175 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="30752c64226ba6f7e596e12313e1d813b202b07c8fa5c5bad850072993bf2126" exitCode=0 Jan 30 18:51:20 crc kubenswrapper[4782]: I0130 18:51:20.268236 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"30752c64226ba6f7e596e12313e1d813b202b07c8fa5c5bad850072993bf2126"} Jan 30 18:51:20 crc kubenswrapper[4782]: I0130 18:51:20.268296 4782 scope.go:117] "RemoveContainer" containerID="fab30dd15f3ee1b70d16c1b0ecbd40cb333806a11c1e80db2f89491cb55d627e" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.277725 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f4cd9dd-25d2-499d-9361-bfecfdc49547","Type":"ContainerStarted","Data":"79dacf1e12cdbe932432a7b02ab32d021eaf66b9041594df821d25a32203eeb0"} Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.279620 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd","Type":"ContainerStarted","Data":"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec"} Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.279664 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd","Type":"ContainerStarted","Data":"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333"} Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.279818 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-log" containerID="cri-o://09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333" gracePeriod=30 Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.279836 4782 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-metadata" containerID="cri-o://387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec" gracePeriod=30 Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.283578 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"e2877db38d8b8ea57e1667182b8a07ac85a48d7c731015462509e5b9de4f7748"} Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.285569 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99a9720e-3e90-4f30-b251-5f63cb746fe3","Type":"ContainerStarted","Data":"922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8"} Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.285603 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99a9720e-3e90-4f30-b251-5f63cb746fe3","Type":"ContainerStarted","Data":"7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2"} Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.287569 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aaa475e2-5c3b-4298-b5a4-e291f861dd0e","Type":"ContainerStarted","Data":"0a49419c06d22746ddf8821b09e15a574fbc7199bdac93744652ca27226eafc0"} Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.287657 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="aaa475e2-5c3b-4298-b5a4-e291f861dd0e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0a49419c06d22746ddf8821b09e15a574fbc7199bdac93744652ca27226eafc0" gracePeriod=30 Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.303090 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.402782427 podStartE2EDuration="6.303072175s" podCreationTimestamp="2026-01-30 18:51:15 +0000 UTC" firstStartedPulling="2026-01-30 18:51:16.586331786 +0000 UTC m=+1252.854709811" lastFinishedPulling="2026-01-30 18:51:20.486621534 +0000 UTC m=+1256.754999559" observedRunningTime="2026-01-30 18:51:21.291585361 +0000 UTC m=+1257.559963386" watchObservedRunningTime="2026-01-30 18:51:21.303072175 +0000 UTC m=+1257.571450200" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.321354 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.423809746 podStartE2EDuration="6.321333806s" podCreationTimestamp="2026-01-30 18:51:15 +0000 UTC" firstStartedPulling="2026-01-30 18:51:16.586335206 +0000 UTC m=+1252.854713231" lastFinishedPulling="2026-01-30 18:51:20.483859266 +0000 UTC m=+1256.752237291" observedRunningTime="2026-01-30 18:51:21.316123398 +0000 UTC m=+1257.584501433" watchObservedRunningTime="2026-01-30 18:51:21.321333806 +0000 UTC m=+1257.589711831" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.358061 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.736801153 podStartE2EDuration="6.358038404s" podCreationTimestamp="2026-01-30 18:51:15 +0000 UTC" firstStartedPulling="2026-01-30 18:51:16.865436555 +0000 UTC m=+1253.133814580" lastFinishedPulling="2026-01-30 18:51:20.486673766 +0000 UTC m=+1256.755051831" observedRunningTime="2026-01-30 
18:51:21.352203159 +0000 UTC m=+1257.620581184" watchObservedRunningTime="2026-01-30 18:51:21.358038404 +0000 UTC m=+1257.626416429" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.377958 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.337885232 podStartE2EDuration="6.377938906s" podCreationTimestamp="2026-01-30 18:51:15 +0000 UTC" firstStartedPulling="2026-01-30 18:51:16.444953561 +0000 UTC m=+1252.713331586" lastFinishedPulling="2026-01-30 18:51:20.485007235 +0000 UTC m=+1256.753385260" observedRunningTime="2026-01-30 18:51:21.366448411 +0000 UTC m=+1257.634826436" watchObservedRunningTime="2026-01-30 18:51:21.377938906 +0000 UTC m=+1257.646316931" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.877030 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.935030 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khr5n\" (UniqueName: \"kubernetes.io/projected/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-kube-api-access-khr5n\") pod \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.935402 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-combined-ca-bundle\") pod \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.935563 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-logs\") pod \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.935666 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-config-data\") pod \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\" (UID: \"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd\") " Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.936713 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-logs" (OuterVolumeSpecName: "logs") pod "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" (UID: "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.940646 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-kube-api-access-khr5n" (OuterVolumeSpecName: "kube-api-access-khr5n") pod "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" (UID: "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd"). InnerVolumeSpecName "kube-api-access-khr5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.963829 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" (UID: "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:21 crc kubenswrapper[4782]: I0130 18:51:21.981003 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-config-data" (OuterVolumeSpecName: "config-data") pod "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" (UID: "52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.038221 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.038281 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.038295 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khr5n\" (UniqueName: \"kubernetes.io/projected/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-kube-api-access-khr5n\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.038308 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.300410 4782 generic.go:334] "Generic (PLEG): container finished" podID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerID="387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec" exitCode=0 Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.300455 4782 generic.go:334] "Generic (PLEG): container finished" podID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerID="09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333" exitCode=143 Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.301531 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.304421 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd","Type":"ContainerDied","Data":"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec"} Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.304493 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd","Type":"ContainerDied","Data":"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333"} Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.304504 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd","Type":"ContainerDied","Data":"bc02419bab85961ecc0a354c09a5aba63a0c963b55c441ab8dacf3d93a7f2801"} Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.304537 4782 scope.go:117] "RemoveContainer" containerID="387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.345898 4782 scope.go:117] "RemoveContainer" containerID="09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.349469 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.358158 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.371674 4782 scope.go:117] "RemoveContainer" containerID="387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec" Jan 30 18:51:22 crc kubenswrapper[4782]: E0130 18:51:22.375389 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec\": container with ID starting with 387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec not found: ID does not exist" containerID="387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.375459 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec"} err="failed to get container status \"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec\": rpc error: code = NotFound desc = could not find container \"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec\": container with ID starting with 387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec not found: ID does not exist" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.375511 4782 scope.go:117] "RemoveContainer" containerID="09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333" Jan 30 18:51:22 crc kubenswrapper[4782]: E0130 18:51:22.376077 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333\": container with ID starting with 09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333 not found: ID does not exist" containerID="09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 
18:51:22.376112 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333"} err="failed to get container status \"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333\": rpc error: code = NotFound desc = could not find container \"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333\": container with ID starting with 09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333 not found: ID does not exist" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.376147 4782 scope.go:117] "RemoveContainer" containerID="387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.376600 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec"} err="failed to get container status \"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec\": rpc error: code = NotFound desc = could not find container \"387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec\": container with ID starting with 387206e480ca6d5535d8dece97f57636c229c7e4a1e2d5ccd3752f5706b9a3ec not found: ID does not exist" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.376640 4782 scope.go:117] "RemoveContainer" containerID="09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.377163 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333"} err="failed to get container status \"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333\": rpc error: code = NotFound desc = could not find container \"09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333\": container with ID starting with 09762037cbcc3054b81490073752ecb17f940eee7c0f5a37157f8d5c9c85c333 not found: ID does not exist" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.385410 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:22 crc kubenswrapper[4782]: E0130 18:51:22.385885 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-log" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.385901 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-log" Jan 30 18:51:22 crc kubenswrapper[4782]: E0130 18:51:22.385918 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-metadata" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.385924 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-metadata" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.398165 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-metadata" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.398204 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" containerName="nova-metadata-log" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.399291 4782 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.399374 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.404364 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.404537 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.432082 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd" path="/var/lib/kubelet/pods/52d42ed7-dbf3-4c17-84a4-a2bbcf5c08bd/volumes" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.551434 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.551583 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-logs\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.551621 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.551721 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-config-data\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.551794 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt2df\" (UniqueName: \"kubernetes.io/projected/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-kube-api-access-jt2df\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.653157 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.653488 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-logs\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.653517 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.653588 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-config-data\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.653637 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt2df\" (UniqueName: \"kubernetes.io/projected/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-kube-api-access-jt2df\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.654244 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-logs\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.668017 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.668517 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-config-data\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.668873 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.671710 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt2df\" (UniqueName: \"kubernetes.io/projected/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-kube-api-access-jt2df\") pod \"nova-metadata-0\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " pod="openstack/nova-metadata-0" Jan 30 18:51:22 crc kubenswrapper[4782]: I0130 18:51:22.762505 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:23 crc kubenswrapper[4782]: I0130 18:51:23.330746 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:24 crc kubenswrapper[4782]: I0130 18:51:24.324169 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2","Type":"ContainerStarted","Data":"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909"} Jan 30 18:51:24 crc kubenswrapper[4782]: I0130 18:51:24.324785 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2","Type":"ContainerStarted","Data":"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f"} Jan 30 18:51:24 crc kubenswrapper[4782]: I0130 18:51:24.324804 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2","Type":"ContainerStarted","Data":"42a8ffa85c7fc22761111d0c93f09d5ee8fd53dea771bff319b4fc6c9c0cd095"} Jan 30 18:51:24 crc kubenswrapper[4782]: I0130 18:51:24.345477 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.345447208 podStartE2EDuration="2.345447208s" podCreationTimestamp="2026-01-30 18:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:24.341206533 +0000 UTC m=+1260.609584568" watchObservedRunningTime="2026-01-30 18:51:24.345447208 +0000 UTC m=+1260.613825243" Jan 30 18:51:25 crc kubenswrapper[4782]: I0130 18:51:25.335139 4782 generic.go:334] "Generic (PLEG): container finished" podID="485cb541-e79c-474e-b68d-34e10ee57480" containerID="74e73487c938481085f5a683833434bb42ac064614962d8c5a80d9ea10221a76" exitCode=0 Jan 30 18:51:25 crc kubenswrapper[4782]: I0130 18:51:25.335319 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xjjp9" event={"ID":"485cb541-e79c-474e-b68d-34e10ee57480","Type":"ContainerDied","Data":"74e73487c938481085f5a683833434bb42ac064614962d8c5a80d9ea10221a76"} Jan 30 18:51:25 crc kubenswrapper[4782]: I0130 18:51:25.855388 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 18:51:25 crc kubenswrapper[4782]: I0130 18:51:25.855805 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 18:51:25 crc kubenswrapper[4782]: I0130 18:51:25.869829 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:51:25 crc kubenswrapper[4782]: I0130 18:51:25.869896 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:51:25 crc kubenswrapper[4782]: I0130 18:51:25.890804 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 18:51:26 crc kubenswrapper[4782]: I0130 18:51:26.164698 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:26 crc kubenswrapper[4782]: I0130 18:51:26.187625 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:51:26 crc kubenswrapper[4782]: I0130 18:51:26.260744 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7b79bb54f5-2wxwg"] Jan 30 18:51:26 crc kubenswrapper[4782]: I0130 18:51:26.260989 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerName="dnsmasq-dns" containerID="cri-o://cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59" gracePeriod=10 Jan 30 18:51:26 crc kubenswrapper[4782]: I0130 18:51:26.456077 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 18:51:26 crc kubenswrapper[4782]: I0130 18:51:26.952677 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:51:26 crc kubenswrapper[4782]: I0130 18:51:26.952982 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.142168 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.147612 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275209 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-config\") pod \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275281 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-scripts\") pod \"485cb541-e79c-474e-b68d-34e10ee57480\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275326 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7cgt\" (UniqueName: \"kubernetes.io/projected/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-kube-api-access-w7cgt\") pod \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275358 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-svc\") pod \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275447 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-config-data\") pod \"485cb541-e79c-474e-b68d-34e10ee57480\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275572 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-nb\") pod \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275609 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-sb\") pod \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275651 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-swift-storage-0\") pod \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\" (UID: \"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275688 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r4lq\" (UniqueName: \"kubernetes.io/projected/485cb541-e79c-474e-b68d-34e10ee57480-kube-api-access-6r4lq\") pod \"485cb541-e79c-474e-b68d-34e10ee57480\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.275771 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-combined-ca-bundle\") pod \"485cb541-e79c-474e-b68d-34e10ee57480\" (UID: \"485cb541-e79c-474e-b68d-34e10ee57480\") " Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.310084 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485cb541-e79c-474e-b68d-34e10ee57480-kube-api-access-6r4lq" (OuterVolumeSpecName: "kube-api-access-6r4lq") pod "485cb541-e79c-474e-b68d-34e10ee57480" (UID: "485cb541-e79c-474e-b68d-34e10ee57480"). InnerVolumeSpecName "kube-api-access-6r4lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.310294 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-kube-api-access-w7cgt" (OuterVolumeSpecName: "kube-api-access-w7cgt") pod "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" (UID: "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5"). InnerVolumeSpecName "kube-api-access-w7cgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.314749 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-scripts" (OuterVolumeSpecName: "scripts") pod "485cb541-e79c-474e-b68d-34e10ee57480" (UID: "485cb541-e79c-474e-b68d-34e10ee57480"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.356167 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" (UID: "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.357764 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-config" (OuterVolumeSpecName: "config") pod "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" (UID: "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.364956 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" (UID: "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.365513 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" (UID: "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.368326 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485cb541-e79c-474e-b68d-34e10ee57480" (UID: "485cb541-e79c-474e-b68d-34e10ee57480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379162 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379201 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379213 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7cgt\" (UniqueName: \"kubernetes.io/projected/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-kube-api-access-w7cgt\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379223 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379275 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379284 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379292 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r4lq\" (UniqueName: 
\"kubernetes.io/projected/485cb541-e79c-474e-b68d-34e10ee57480-kube-api-access-6r4lq\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.379300 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.390033 4782 generic.go:334] "Generic (PLEG): container finished" podID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerID="cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59" exitCode=0 Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.390095 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" event={"ID":"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5","Type":"ContainerDied","Data":"cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59"} Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.390122 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" event={"ID":"ef7e51ec-a2d1-476c-a79f-05b16ed15aa5","Type":"ContainerDied","Data":"7705a502c2ec01a5712891bc8467e5e87c5727bc7c46e19ff2a77e3e962d8bbe"} Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.390137 4782 scope.go:117] "RemoveContainer" containerID="cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.390286 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.400678 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xjjp9" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.400787 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xjjp9" event={"ID":"485cb541-e79c-474e-b68d-34e10ee57480","Type":"ContainerDied","Data":"c0b1a6fa1c016ef8ff656e747432a7af40685d65d5c20f0789fa7f5f6881010c"} Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.400830 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b1a6fa1c016ef8ff656e747432a7af40685d65d5c20f0789fa7f5f6881010c" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.411390 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-config-data" (OuterVolumeSpecName: "config-data") pod "485cb541-e79c-474e-b68d-34e10ee57480" (UID: "485cb541-e79c-474e-b68d-34e10ee57480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.412738 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" (UID: "ef7e51ec-a2d1-476c-a79f-05b16ed15aa5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.468318 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.468550 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-log" containerID="cri-o://7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2" gracePeriod=30 Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.468674 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-api" containerID="cri-o://922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8" gracePeriod=30 Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.481370 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485cb541-e79c-474e-b68d-34e10ee57480-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.481398 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.513814 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.514128 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-log" containerID="cri-o://ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f" gracePeriod=30 Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.514359 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-metadata" containerID="cri-o://595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909" gracePeriod=30 Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.521079 4782 scope.go:117] "RemoveContainer" containerID="7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.545608 4782 scope.go:117] "RemoveContainer" containerID="cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59" Jan 30 18:51:27 crc kubenswrapper[4782]: E0130 18:51:27.546351 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59\": container with ID starting with cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59 not found: ID does not exist" containerID="cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.546410 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59"} err="failed to get container status \"cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59\": rpc error: code = NotFound desc = could not find container 
\"cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59\": container with ID starting with cd81477fb1bd0f95306bf2f0b79bd573b3a9d4a789683a85149b70337de61f59 not found: ID does not exist" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.546458 4782 scope.go:117] "RemoveContainer" containerID="7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe" Jan 30 18:51:27 crc kubenswrapper[4782]: E0130 18:51:27.546715 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe\": container with ID starting with 7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe not found: ID does not exist" containerID="7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.546741 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe"} err="failed to get container status \"7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe\": rpc error: code = NotFound desc = could not find container \"7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe\": container with ID starting with 7cabed87fb72aefbbad125dde176c677bc30b0d6e34016f964003e273c918ebe not found: ID does not exist" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.652057 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.763424 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.763483 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.766979 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b79bb54f5-2wxwg"] Jan 30 18:51:27 crc kubenswrapper[4782]: I0130 18:51:27.775664 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b79bb54f5-2wxwg"] Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.039512 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.093639 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-nova-metadata-tls-certs\") pod \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.093707 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt2df\" (UniqueName: \"kubernetes.io/projected/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-kube-api-access-jt2df\") pod \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.093918 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-combined-ca-bundle\") pod \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.093967 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-logs\") pod \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.093993 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-config-data\") pod \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\" (UID: \"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2\") " Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.097504 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-logs" (OuterVolumeSpecName: "logs") pod "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" (UID: "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.103489 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-kube-api-access-jt2df" (OuterVolumeSpecName: "kube-api-access-jt2df") pod "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" (UID: "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2"). InnerVolumeSpecName "kube-api-access-jt2df". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.136445 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" (UID: "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.140377 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-config-data" (OuterVolumeSpecName: "config-data") pod "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" (UID: "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.169479 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" (UID: "c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.195794 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.195827 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.195838 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.195848 4782 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.195858 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt2df\" (UniqueName: \"kubernetes.io/projected/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2-kube-api-access-jt2df\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.411038 4782 generic.go:334] "Generic (PLEG): container finished" podID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerID="595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909" exitCode=0 Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.411074 4782 generic.go:334] "Generic (PLEG): container finished" podID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerID="ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f" exitCode=143 Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.411159 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.434856 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" path="/var/lib/kubelet/pods/ef7e51ec-a2d1-476c-a79f-05b16ed15aa5/volumes" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.435401 4782 generic.go:334] "Generic (PLEG): container finished" podID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerID="7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2" exitCode=143 Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.435587 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9f4cd9dd-25d2-499d-9361-bfecfdc49547" containerName="nova-scheduler-scheduler" containerID="cri-o://79dacf1e12cdbe932432a7b02ab32d021eaf66b9041594df821d25a32203eeb0" gracePeriod=30 Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.436016 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2","Type":"ContainerDied","Data":"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909"} Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.436047 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2","Type":"ContainerDied","Data":"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f"} Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.436059 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2","Type":"ContainerDied","Data":"42a8ffa85c7fc22761111d0c93f09d5ee8fd53dea771bff319b4fc6c9c0cd095"} Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.436071 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99a9720e-3e90-4f30-b251-5f63cb746fe3","Type":"ContainerDied","Data":"7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2"} Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.436089 4782 scope.go:117] "RemoveContainer" containerID="595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.481045 4782 scope.go:117] "RemoveContainer" containerID="ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.490796 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.501887 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.507210 4782 scope.go:117] "RemoveContainer" containerID="595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909" Jan 30 18:51:28 crc kubenswrapper[4782]: E0130 18:51:28.507681 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909\": container with ID starting with 595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909 not found: ID does not exist" containerID="595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.507711 4782 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909"} err="failed to get container status \"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909\": rpc error: code = NotFound desc = could not find container \"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909\": container with ID starting with 595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909 not found: ID does not exist" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.507732 4782 scope.go:117] "RemoveContainer" containerID="ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f" Jan 30 18:51:28 crc kubenswrapper[4782]: E0130 18:51:28.507902 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f\": container with ID starting with ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f not found: ID does not exist" containerID="ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.507920 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f"} err="failed to get container status \"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f\": rpc error: code = NotFound desc = could not find container \"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f\": container with ID starting with ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f not found: ID does not exist" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.507935 4782 scope.go:117] "RemoveContainer" containerID="595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.508090 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909"} err="failed to get container status \"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909\": rpc error: code = NotFound desc = could not find container \"595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909\": container with ID starting with 595d681b19cb7e72e77a7896efddd238ec2b050c63dcd87b5f60aa2f89557909 not found: ID does not exist" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.508107 4782 scope.go:117] "RemoveContainer" containerID="ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.508345 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f"} err="failed to get container status \"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f\": rpc error: code = NotFound desc = could not find container \"ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f\": container with ID starting with ddf54fa981a7d93f729b28b7e28191a5a6baa4ebb3cf471cbffcc2e6fa798f6f not found: ID does not exist" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.518343 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:28 crc kubenswrapper[4782]: E0130 18:51:28.524651 4782 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-metadata" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524687 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-metadata" Jan 30 18:51:28 crc kubenswrapper[4782]: E0130 18:51:28.524702 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-log" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524710 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-log" Jan 30 18:51:28 crc kubenswrapper[4782]: E0130 18:51:28.524730 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485cb541-e79c-474e-b68d-34e10ee57480" containerName="nova-manage" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524738 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="485cb541-e79c-474e-b68d-34e10ee57480" containerName="nova-manage" Jan 30 18:51:28 crc kubenswrapper[4782]: E0130 18:51:28.524761 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerName="dnsmasq-dns" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524767 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerName="dnsmasq-dns" Jan 30 18:51:28 crc kubenswrapper[4782]: E0130 18:51:28.524775 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerName="init" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524780 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerName="init" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524952 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-metadata" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524969 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="485cb541-e79c-474e-b68d-34e10ee57480" containerName="nova-manage" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524985 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerName="dnsmasq-dns" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.524991 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" containerName="nova-metadata-log" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.525979 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.529222 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.529431 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.533666 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.603428 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1950feda-6261-4e6d-8edd-26caa31998b4-logs\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.603510 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-config-data\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.603999 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp57p\" (UniqueName: \"kubernetes.io/projected/1950feda-6261-4e6d-8edd-26caa31998b4-kube-api-access-fp57p\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.604088 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.604202 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.706163 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.706302 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1950feda-6261-4e6d-8edd-26caa31998b4-logs\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.706349 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-config-data\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc 
kubenswrapper[4782]: I0130 18:51:28.706414 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp57p\" (UniqueName: \"kubernetes.io/projected/1950feda-6261-4e6d-8edd-26caa31998b4-kube-api-access-fp57p\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.706439 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.710823 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1950feda-6261-4e6d-8edd-26caa31998b4-logs\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.711986 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.712814 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.714948 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-config-data\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.729779 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp57p\" (UniqueName: \"kubernetes.io/projected/1950feda-6261-4e6d-8edd-26caa31998b4-kube-api-access-fp57p\") pod \"nova-metadata-0\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " pod="openstack/nova-metadata-0" Jan 30 18:51:28 crc kubenswrapper[4782]: I0130 18:51:28.873040 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.362407 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:51:29 crc kubenswrapper[4782]: W0130 18:51:29.364710 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1950feda_6261_4e6d_8edd_26caa31998b4.slice/crio-699ddb65b6e83015efa8e31dd86fe8806365451f69d2ce2ea2194023a6ad0214 WatchSource:0}: Error finding container 699ddb65b6e83015efa8e31dd86fe8806365451f69d2ce2ea2194023a6ad0214: Status 404 returned error can't find the container with id 699ddb65b6e83015efa8e31dd86fe8806365451f69d2ce2ea2194023a6ad0214 Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.449551 4782 generic.go:334] "Generic (PLEG): container finished" podID="9f4cd9dd-25d2-499d-9361-bfecfdc49547" containerID="79dacf1e12cdbe932432a7b02ab32d021eaf66b9041594df821d25a32203eeb0" exitCode=0 Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.449642 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f4cd9dd-25d2-499d-9361-bfecfdc49547","Type":"ContainerDied","Data":"79dacf1e12cdbe932432a7b02ab32d021eaf66b9041594df821d25a32203eeb0"} Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.449679 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9f4cd9dd-25d2-499d-9361-bfecfdc49547","Type":"ContainerDied","Data":"553838a2e0d358ae176ff0281bd886ae0e2afd6e012586ca6fb0d4d083a01611"} Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.449699 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="553838a2e0d358ae176ff0281bd886ae0e2afd6e012586ca6fb0d4d083a01611" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.452479 4782 generic.go:334] "Generic (PLEG): container finished" podID="7c429234-8185-4737-91f8-a65403cc83d2" containerID="0e9cacaa7b0cdad45cfb0edefec7a15f4c4225bee109c997a5cd0aee4813d2c1" exitCode=0 Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.452569 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" event={"ID":"7c429234-8185-4737-91f8-a65403cc83d2","Type":"ContainerDied","Data":"0e9cacaa7b0cdad45cfb0edefec7a15f4c4225bee109c997a5cd0aee4813d2c1"} Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.454481 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1950feda-6261-4e6d-8edd-26caa31998b4","Type":"ContainerStarted","Data":"699ddb65b6e83015efa8e31dd86fe8806365451f69d2ce2ea2194023a6ad0214"} Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.462725 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.522936 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-config-data\") pod \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.523054 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8nmt\" (UniqueName: \"kubernetes.io/projected/9f4cd9dd-25d2-499d-9361-bfecfdc49547-kube-api-access-g8nmt\") pod \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.523126 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-combined-ca-bundle\") pod \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\" (UID: \"9f4cd9dd-25d2-499d-9361-bfecfdc49547\") " Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.557497 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4cd9dd-25d2-499d-9361-bfecfdc49547-kube-api-access-g8nmt" (OuterVolumeSpecName: "kube-api-access-g8nmt") pod "9f4cd9dd-25d2-499d-9361-bfecfdc49547" (UID: "9f4cd9dd-25d2-499d-9361-bfecfdc49547"). InnerVolumeSpecName "kube-api-access-g8nmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.600383 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-config-data" (OuterVolumeSpecName: "config-data") pod "9f4cd9dd-25d2-499d-9361-bfecfdc49547" (UID: "9f4cd9dd-25d2-499d-9361-bfecfdc49547"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.616365 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f4cd9dd-25d2-499d-9361-bfecfdc49547" (UID: "9f4cd9dd-25d2-499d-9361-bfecfdc49547"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.632441 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.632480 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8nmt\" (UniqueName: \"kubernetes.io/projected/9f4cd9dd-25d2-499d-9361-bfecfdc49547-kube-api-access-g8nmt\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:29 crc kubenswrapper[4782]: I0130 18:51:29.632492 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f4cd9dd-25d2-499d-9361-bfecfdc49547-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.422292 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2" path="/var/lib/kubelet/pods/c7e1ac03-7309-4d8d-a1f6-4fe23a21e5e2/volumes" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.478071 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.478083 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1950feda-6261-4e6d-8edd-26caa31998b4","Type":"ContainerStarted","Data":"04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80"} Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.478127 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1950feda-6261-4e6d-8edd-26caa31998b4","Type":"ContainerStarted","Data":"8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94"} Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.507294 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.507273166 podStartE2EDuration="2.507273166s" podCreationTimestamp="2026-01-30 18:51:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:30.499584176 +0000 UTC m=+1266.767962201" watchObservedRunningTime="2026-01-30 18:51:30.507273166 +0000 UTC m=+1266.775651191" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.537269 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.552960 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.566504 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:30 crc kubenswrapper[4782]: E0130 18:51:30.566959 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4cd9dd-25d2-499d-9361-bfecfdc49547" containerName="nova-scheduler-scheduler" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.566979 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4cd9dd-25d2-499d-9361-bfecfdc49547" containerName="nova-scheduler-scheduler" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.567182 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4cd9dd-25d2-499d-9361-bfecfdc49547" containerName="nova-scheduler-scheduler" Jan 30 18:51:30 
crc kubenswrapper[4782]: I0130 18:51:30.567906 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.574373 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.577049 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.655527 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngqxs\" (UniqueName: \"kubernetes.io/projected/2d111af9-ea7a-4826-a405-c14931dfd7b3-kube-api-access-ngqxs\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.655611 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.655869 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-config-data\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.757626 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-config-data\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.758015 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngqxs\" (UniqueName: \"kubernetes.io/projected/2d111af9-ea7a-4826-a405-c14931dfd7b3-kube-api-access-ngqxs\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.758041 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.763514 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-config-data\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.805890 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.815343 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ngqxs\" (UniqueName: \"kubernetes.io/projected/2d111af9-ea7a-4826-a405-c14931dfd7b3-kube-api-access-ngqxs\") pod \"nova-scheduler-0\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.888426 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.898492 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.960772 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-scripts\") pod \"7c429234-8185-4737-91f8-a65403cc83d2\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.960842 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlk6c\" (UniqueName: \"kubernetes.io/projected/7c429234-8185-4737-91f8-a65403cc83d2-kube-api-access-mlk6c\") pod \"7c429234-8185-4737-91f8-a65403cc83d2\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.960960 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-config-data\") pod \"7c429234-8185-4737-91f8-a65403cc83d2\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.961046 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-combined-ca-bundle\") pod \"7c429234-8185-4737-91f8-a65403cc83d2\" (UID: \"7c429234-8185-4737-91f8-a65403cc83d2\") " Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.965671 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c429234-8185-4737-91f8-a65403cc83d2-kube-api-access-mlk6c" (OuterVolumeSpecName: "kube-api-access-mlk6c") pod "7c429234-8185-4737-91f8-a65403cc83d2" (UID: "7c429234-8185-4737-91f8-a65403cc83d2"). InnerVolumeSpecName "kube-api-access-mlk6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.965998 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-scripts" (OuterVolumeSpecName: "scripts") pod "7c429234-8185-4737-91f8-a65403cc83d2" (UID: "7c429234-8185-4737-91f8-a65403cc83d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.988590 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-config-data" (OuterVolumeSpecName: "config-data") pod "7c429234-8185-4737-91f8-a65403cc83d2" (UID: "7c429234-8185-4737-91f8-a65403cc83d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:30 crc kubenswrapper[4782]: I0130 18:51:30.991119 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c429234-8185-4737-91f8-a65403cc83d2" (UID: "7c429234-8185-4737-91f8-a65403cc83d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.020794 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.065133 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.065169 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.065181 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c429234-8185-4737-91f8-a65403cc83d2-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.065189 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlk6c\" (UniqueName: \"kubernetes.io/projected/7c429234-8185-4737-91f8-a65403cc83d2-kube-api-access-mlk6c\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.166770 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-config-data\") pod \"99a9720e-3e90-4f30-b251-5f63cb746fe3\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.166989 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-combined-ca-bundle\") pod \"99a9720e-3e90-4f30-b251-5f63cb746fe3\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.167123 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmnlf\" (UniqueName: \"kubernetes.io/projected/99a9720e-3e90-4f30-b251-5f63cb746fe3-kube-api-access-cmnlf\") pod \"99a9720e-3e90-4f30-b251-5f63cb746fe3\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.167169 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a9720e-3e90-4f30-b251-5f63cb746fe3-logs\") pod \"99a9720e-3e90-4f30-b251-5f63cb746fe3\" (UID: \"99a9720e-3e90-4f30-b251-5f63cb746fe3\") " Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.168221 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a9720e-3e90-4f30-b251-5f63cb746fe3-logs" (OuterVolumeSpecName: "logs") pod "99a9720e-3e90-4f30-b251-5f63cb746fe3" (UID: "99a9720e-3e90-4f30-b251-5f63cb746fe3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.173211 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a9720e-3e90-4f30-b251-5f63cb746fe3-kube-api-access-cmnlf" (OuterVolumeSpecName: "kube-api-access-cmnlf") pod "99a9720e-3e90-4f30-b251-5f63cb746fe3" (UID: "99a9720e-3e90-4f30-b251-5f63cb746fe3"). InnerVolumeSpecName "kube-api-access-cmnlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.197426 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99a9720e-3e90-4f30-b251-5f63cb746fe3" (UID: "99a9720e-3e90-4f30-b251-5f63cb746fe3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.200406 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-config-data" (OuterVolumeSpecName: "config-data") pod "99a9720e-3e90-4f30-b251-5f63cb746fe3" (UID: "99a9720e-3e90-4f30-b251-5f63cb746fe3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.269725 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmnlf\" (UniqueName: \"kubernetes.io/projected/99a9720e-3e90-4f30-b251-5f63cb746fe3-kube-api-access-cmnlf\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.269771 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a9720e-3e90-4f30-b251-5f63cb746fe3-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.269789 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.269802 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a9720e-3e90-4f30-b251-5f63cb746fe3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.382100 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:51:31 crc kubenswrapper[4782]: W0130 18:51:31.405086 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d111af9_ea7a_4826_a405_c14931dfd7b3.slice/crio-311123e9c69aff50930999dea4c489db12185db8fce1de11277bd0886913645c WatchSource:0}: Error finding container 311123e9c69aff50930999dea4c489db12185db8fce1de11277bd0886913645c: Status 404 returned error can't find the container with id 311123e9c69aff50930999dea4c489db12185db8fce1de11277bd0886913645c Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.495905 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.496386 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8ht9p" event={"ID":"7c429234-8185-4737-91f8-a65403cc83d2","Type":"ContainerDied","Data":"d5a8b11fec75f716fe3355b672af879c40e3277d0424f8f1610592ae14c1b9a9"} Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.496435 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a8b11fec75f716fe3355b672af879c40e3277d0424f8f1610592ae14c1b9a9" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.506422 4782 generic.go:334] "Generic (PLEG): container finished" podID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerID="922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8" exitCode=0 Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.506497 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99a9720e-3e90-4f30-b251-5f63cb746fe3","Type":"ContainerDied","Data":"922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8"} Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.506524 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99a9720e-3e90-4f30-b251-5f63cb746fe3","Type":"ContainerDied","Data":"05cb6f83785a91d1bad8b609d13bfd045b019f1eaf0be8e112c9b345a32f3293"} Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.506541 4782 scope.go:117] "RemoveContainer" containerID="922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.506665 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.514689 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d111af9-ea7a-4826-a405-c14931dfd7b3","Type":"ContainerStarted","Data":"311123e9c69aff50930999dea4c489db12185db8fce1de11277bd0886913645c"} Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.551932 4782 scope.go:117] "RemoveContainer" containerID="7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.575543 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 18:51:31 crc kubenswrapper[4782]: E0130 18:51:31.576559 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-api" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.576579 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-api" Jan 30 18:51:31 crc kubenswrapper[4782]: E0130 18:51:31.576604 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-log" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.576610 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-log" Jan 30 18:51:31 crc kubenswrapper[4782]: E0130 18:51:31.576633 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c429234-8185-4737-91f8-a65403cc83d2" containerName="nova-cell1-conductor-db-sync" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.576640 4782 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7c429234-8185-4737-91f8-a65403cc83d2" containerName="nova-cell1-conductor-db-sync" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.576815 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-api" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.576829 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c429234-8185-4737-91f8-a65403cc83d2" containerName="nova-cell1-conductor-db-sync" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.576844 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" containerName="nova-api-log" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.577487 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.582079 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.589004 4782 scope.go:117] "RemoveContainer" containerID="922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.589092 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:31 crc kubenswrapper[4782]: E0130 18:51:31.593720 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8\": container with ID starting with 922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8 not found: ID does not exist" containerID="922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.593763 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8"} err="failed to get container status \"922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8\": rpc error: code = NotFound desc = could not find container \"922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8\": container with ID starting with 922591bd1be8ceac90ec8af7fc6aa797215194fe67c5d9f0bdb2cb99c1250cd8 not found: ID does not exist" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.593788 4782 scope.go:117] "RemoveContainer" containerID="7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2" Jan 30 18:51:31 crc kubenswrapper[4782]: E0130 18:51:31.594289 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2\": container with ID starting with 7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2 not found: ID does not exist" containerID="7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.594327 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2"} err="failed to get container status \"7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2\": rpc error: code = NotFound desc = could not find container 
\"7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2\": container with ID starting with 7a9b66ab793e47cce9443c59977f4f5466ff7046e5d82ccaacce2b7d5ad4baa2 not found: ID does not exist" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.600371 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.616717 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.625512 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.627646 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.629708 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.642608 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.681387 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrx5\" (UniqueName: \"kubernetes.io/projected/71afc4ce-765f-4c71-a76e-6a4eff2b553d-kube-api-access-ltrx5\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.681429 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.681463 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zzp\" (UniqueName: \"kubernetes.io/projected/4fcab3e1-35ab-441d-aed7-b39057f9541f-kube-api-access-m8zzp\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.681482 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71afc4ce-765f-4c71-a76e-6a4eff2b553d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.681572 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fcab3e1-35ab-441d-aed7-b39057f9541f-logs\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.681601 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-config-data\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.681762 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71afc4ce-765f-4c71-a76e-6a4eff2b553d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.772347 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b79bb54f5-2wxwg" podUID="ef7e51ec-a2d1-476c-a79f-05b16ed15aa5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.180:5353: i/o timeout" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.784259 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-config-data\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.784786 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71afc4ce-765f-4c71-a76e-6a4eff2b553d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.784876 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrx5\" (UniqueName: \"kubernetes.io/projected/71afc4ce-765f-4c71-a76e-6a4eff2b553d-kube-api-access-ltrx5\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.784911 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.784959 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zzp\" (UniqueName: \"kubernetes.io/projected/4fcab3e1-35ab-441d-aed7-b39057f9541f-kube-api-access-m8zzp\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.784995 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71afc4ce-765f-4c71-a76e-6a4eff2b553d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.785092 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fcab3e1-35ab-441d-aed7-b39057f9541f-logs\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.785694 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fcab3e1-35ab-441d-aed7-b39057f9541f-logs\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.790108 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71afc4ce-765f-4c71-a76e-6a4eff2b553d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.790210 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71afc4ce-765f-4c71-a76e-6a4eff2b553d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.790761 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-config-data\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.790904 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.807353 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltrx5\" (UniqueName: \"kubernetes.io/projected/71afc4ce-765f-4c71-a76e-6a4eff2b553d-kube-api-access-ltrx5\") pod \"nova-cell1-conductor-0\" (UID: \"71afc4ce-765f-4c71-a76e-6a4eff2b553d\") " pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.808656 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zzp\" (UniqueName: \"kubernetes.io/projected/4fcab3e1-35ab-441d-aed7-b39057f9541f-kube-api-access-m8zzp\") pod \"nova-api-0\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " pod="openstack/nova-api-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.916569 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:31 crc kubenswrapper[4782]: I0130 18:51:31.954581 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.434589 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a9720e-3e90-4f30-b251-5f63cb746fe3" path="/var/lib/kubelet/pods/99a9720e-3e90-4f30-b251-5f63cb746fe3/volumes" Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.438384 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4cd9dd-25d2-499d-9361-bfecfdc49547" path="/var/lib/kubelet/pods/9f4cd9dd-25d2-499d-9361-bfecfdc49547/volumes" Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.439842 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.488748 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:32 crc kubenswrapper[4782]: W0130 18:51:32.510412 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fcab3e1_35ab_441d_aed7_b39057f9541f.slice/crio-95e37c42061b9ae281246319ba4d90180c18c5e56d29e8ccb54ecb956e79a4a5 WatchSource:0}: Error finding container 95e37c42061b9ae281246319ba4d90180c18c5e56d29e8ccb54ecb956e79a4a5: Status 404 returned error can't find the container with id 95e37c42061b9ae281246319ba4d90180c18c5e56d29e8ccb54ecb956e79a4a5 Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.529989 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"71afc4ce-765f-4c71-a76e-6a4eff2b553d","Type":"ContainerStarted","Data":"9ba4eed9d633bb0b75f67d0b25f919837298c7605db2e46104921071e4edea34"} Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.532383 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d111af9-ea7a-4826-a405-c14931dfd7b3","Type":"ContainerStarted","Data":"843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821"} Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.533581 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fcab3e1-35ab-441d-aed7-b39057f9541f","Type":"ContainerStarted","Data":"95e37c42061b9ae281246319ba4d90180c18c5e56d29e8ccb54ecb956e79a4a5"} Jan 30 18:51:32 crc kubenswrapper[4782]: I0130 18:51:32.548017 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.548000069 podStartE2EDuration="2.548000069s" podCreationTimestamp="2026-01-30 18:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:32.547116607 +0000 UTC m=+1268.815494632" watchObservedRunningTime="2026-01-30 18:51:32.548000069 +0000 UTC m=+1268.816378084" Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.544638 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fcab3e1-35ab-441d-aed7-b39057f9541f","Type":"ContainerStarted","Data":"71855bea56dcb1811850e5182e09d49776be777c8b7d0b01e0bce1065310ca5c"} Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.544926 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fcab3e1-35ab-441d-aed7-b39057f9541f","Type":"ContainerStarted","Data":"8193bd46b48d558be095a8524ee2dd06a10755718f16293d67574b925dfa72b2"} Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.549222 4782 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"71afc4ce-765f-4c71-a76e-6a4eff2b553d","Type":"ContainerStarted","Data":"eeab052de665adca7b3a5b174905831161de6d67f4fa85d9ea94a717ae999d01"} Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.549281 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.569533 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.569512959 podStartE2EDuration="2.569512959s" podCreationTimestamp="2026-01-30 18:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:33.567172151 +0000 UTC m=+1269.835550196" watchObservedRunningTime="2026-01-30 18:51:33.569512959 +0000 UTC m=+1269.837890994" Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.602598 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.6025692559999998 podStartE2EDuration="2.602569256s" podCreationTimestamp="2026-01-30 18:51:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:33.596052705 +0000 UTC m=+1269.864430750" watchObservedRunningTime="2026-01-30 18:51:33.602569256 +0000 UTC m=+1269.870947291" Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.873542 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 18:51:33 crc kubenswrapper[4782]: I0130 18:51:33.873897 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 18:51:35 crc kubenswrapper[4782]: I0130 18:51:35.888967 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 18:51:38 crc kubenswrapper[4782]: I0130 18:51:38.873501 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 18:51:38 crc kubenswrapper[4782]: I0130 18:51:38.874178 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 18:51:39 crc kubenswrapper[4782]: I0130 18:51:39.885452 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 18:51:39 crc kubenswrapper[4782]: I0130 18:51:39.885467 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 18:51:40 crc kubenswrapper[4782]: I0130 18:51:40.539080 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 18:51:40 crc kubenswrapper[4782]: I0130 18:51:40.889522 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 18:51:40 crc kubenswrapper[4782]: I0130 18:51:40.926755 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Jan 30 18:51:41 crc kubenswrapper[4782]: I0130 18:51:41.685089 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 18:51:41 crc kubenswrapper[4782]: I0130 18:51:41.949885 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 30 18:51:41 crc kubenswrapper[4782]: I0130 18:51:41.955305 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:51:41 crc kubenswrapper[4782]: I0130 18:51:41.955374 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:51:43 crc kubenswrapper[4782]: I0130 18:51:43.037427 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:51:43 crc kubenswrapper[4782]: I0130 18:51:43.037700 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.220:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 18:51:48 crc kubenswrapper[4782]: I0130 18:51:48.966116 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 18:51:48 crc kubenswrapper[4782]: I0130 18:51:48.968922 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 18:51:48 crc kubenswrapper[4782]: I0130 18:51:48.996307 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.710805 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8j29m"] Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.714767 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.727905 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j29m"] Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.760189 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.810977 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-catalog-content\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.811129 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-utilities\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.811154 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5fjp\" (UniqueName: \"kubernetes.io/projected/16d92c41-9da2-4d53-ac41-7f12943e6360-kube-api-access-r5fjp\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.913272 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-catalog-content\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.913405 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-utilities\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.913433 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5fjp\" (UniqueName: \"kubernetes.io/projected/16d92c41-9da2-4d53-ac41-7f12943e6360-kube-api-access-r5fjp\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.914136 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-catalog-content\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.914202 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-utilities\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " 
pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:49 crc kubenswrapper[4782]: I0130 18:51:49.934050 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5fjp\" (UniqueName: \"kubernetes.io/projected/16d92c41-9da2-4d53-ac41-7f12943e6360-kube-api-access-r5fjp\") pod \"redhat-marketplace-8j29m\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:50 crc kubenswrapper[4782]: I0130 18:51:50.058123 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:51:50 crc kubenswrapper[4782]: I0130 18:51:50.562004 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j29m"] Jan 30 18:51:50 crc kubenswrapper[4782]: I0130 18:51:50.753188 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j29m" event={"ID":"16d92c41-9da2-4d53-ac41-7f12943e6360","Type":"ContainerStarted","Data":"e49ac394848548fc0bbfd03bade755cdc1e2cd7d2c358d3523da7d899d17a2ad"} Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.495627 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-22nc8"] Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.498061 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.525404 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22nc8"] Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.558217 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-utilities\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.558508 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8b9\" (UniqueName: \"kubernetes.io/projected/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-kube-api-access-dz8b9\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.558988 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-catalog-content\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.661934 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz8b9\" (UniqueName: \"kubernetes.io/projected/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-kube-api-access-dz8b9\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.662102 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-catalog-content\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.662166 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-utilities\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.662932 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-utilities\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.663535 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-catalog-content\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.696153 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz8b9\" (UniqueName: \"kubernetes.io/projected/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-kube-api-access-dz8b9\") pod \"redhat-operators-22nc8\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.763570 4782 generic.go:334] "Generic (PLEG): container finished" podID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerID="cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99" exitCode=0 Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.763650 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j29m" event={"ID":"16d92c41-9da2-4d53-ac41-7f12943e6360","Type":"ContainerDied","Data":"cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99"} Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.766255 4782 generic.go:334] "Generic (PLEG): container finished" podID="aaa475e2-5c3b-4298-b5a4-e291f861dd0e" containerID="0a49419c06d22746ddf8821b09e15a574fbc7199bdac93744652ca27226eafc0" exitCode=137 Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.766330 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aaa475e2-5c3b-4298-b5a4-e291f861dd0e","Type":"ContainerDied","Data":"0a49419c06d22746ddf8821b09e15a574fbc7199bdac93744652ca27226eafc0"} Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.815262 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.893550 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.966578 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.967636 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhrgv\" (UniqueName: \"kubernetes.io/projected/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-kube-api-access-dhrgv\") pod \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.967773 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-combined-ca-bundle\") pod \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.967799 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-config-data\") pod \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\" (UID: \"aaa475e2-5c3b-4298-b5a4-e291f861dd0e\") " Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.968367 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.974096 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.974258 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-kube-api-access-dhrgv" (OuterVolumeSpecName: "kube-api-access-dhrgv") pod "aaa475e2-5c3b-4298-b5a4-e291f861dd0e" (UID: "aaa475e2-5c3b-4298-b5a4-e291f861dd0e"). InnerVolumeSpecName "kube-api-access-dhrgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:51 crc kubenswrapper[4782]: I0130 18:51:51.975203 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.009452 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-config-data" (OuterVolumeSpecName: "config-data") pod "aaa475e2-5c3b-4298-b5a4-e291f861dd0e" (UID: "aaa475e2-5c3b-4298-b5a4-e291f861dd0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.012599 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaa475e2-5c3b-4298-b5a4-e291f861dd0e" (UID: "aaa475e2-5c3b-4298-b5a4-e291f861dd0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.070333 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.070366 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.070375 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhrgv\" (UniqueName: \"kubernetes.io/projected/aaa475e2-5c3b-4298-b5a4-e291f861dd0e-kube-api-access-dhrgv\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.353014 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22nc8"] Jan 30 18:51:52 crc kubenswrapper[4782]: W0130 18:51:52.356719 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod464f5bf8_cf27_4e1e_aec8_cd3decdfb0dd.slice/crio-fa9489977de2fc5ac94bb77980ac3bd9d58c0ba7fa91ba7cd13bf01ba96d8eff WatchSource:0}: Error finding container fa9489977de2fc5ac94bb77980ac3bd9d58c0ba7fa91ba7cd13bf01ba96d8eff: Status 404 returned error can't find the container with id fa9489977de2fc5ac94bb77980ac3bd9d58c0ba7fa91ba7cd13bf01ba96d8eff Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.778443 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j29m" event={"ID":"16d92c41-9da2-4d53-ac41-7f12943e6360","Type":"ContainerStarted","Data":"b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9"} Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.781369 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.781830 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"aaa475e2-5c3b-4298-b5a4-e291f861dd0e","Type":"ContainerDied","Data":"0ebbe6ec0319c408c3b5a25a2cf484a43b0b0b0888b576c7c96efdeedd7cad74"} Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.781865 4782 scope.go:117] "RemoveContainer" containerID="0a49419c06d22746ddf8821b09e15a574fbc7199bdac93744652ca27226eafc0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.785835 4782 generic.go:334] "Generic (PLEG): container finished" podID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerID="3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b" exitCode=0 Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.785908 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22nc8" event={"ID":"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd","Type":"ContainerDied","Data":"3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b"} Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.785962 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22nc8" event={"ID":"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd","Type":"ContainerStarted","Data":"fa9489977de2fc5ac94bb77980ac3bd9d58c0ba7fa91ba7cd13bf01ba96d8eff"} Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.786159 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.799252 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.844863 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.853147 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.897161 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:52 crc kubenswrapper[4782]: E0130 18:51:52.897516 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa475e2-5c3b-4298-b5a4-e291f861dd0e" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.897531 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa475e2-5c3b-4298-b5a4-e291f861dd0e" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.897748 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa475e2-5c3b-4298-b5a4-e291f861dd0e" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.898372 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.913068 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.913384 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.913851 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.922158 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.986143 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.986390 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.986416 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdkwj\" (UniqueName: \"kubernetes.io/projected/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-kube-api-access-hdkwj\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.986443 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:52 crc kubenswrapper[4782]: I0130 18:51:52.986483 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.091690 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.091733 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.091757 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdkwj\" (UniqueName: \"kubernetes.io/projected/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-kube-api-access-hdkwj\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.091782 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.091819 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.103911 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.107080 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.111827 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.124048 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.148327 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8fb4d68c5-ftns8"] Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.149918 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.177249 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdkwj\" (UniqueName: \"kubernetes.io/projected/858bbcbd-4a47-42ee-a581-2b03ca45dcaa-kube-api-access-hdkwj\") pod \"nova-cell1-novncproxy-0\" (UID: \"858bbcbd-4a47-42ee-a581-2b03ca45dcaa\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.185125 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8fb4d68c5-ftns8"] Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.207846 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-sb\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.207931 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-svc\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.208118 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-swift-storage-0\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.208167 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm568\" (UniqueName: \"kubernetes.io/projected/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-kube-api-access-zm568\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.208202 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-nb\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.208235 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-config\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.215076 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.314368 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-swift-storage-0\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.314623 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm568\" (UniqueName: \"kubernetes.io/projected/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-kube-api-access-zm568\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.314671 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-nb\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.314710 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-config\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.314772 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-sb\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.314829 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-svc\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.315613 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-swift-storage-0\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.315681 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-svc\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.316411 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-nb\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: 
I0130 18:51:53.316865 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-config\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.317386 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-sb\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.334411 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm568\" (UniqueName: \"kubernetes.io/projected/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-kube-api-access-zm568\") pod \"dnsmasq-dns-8fb4d68c5-ftns8\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.542171 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.812526 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22nc8" event={"ID":"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd","Type":"ContainerStarted","Data":"b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64"} Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.824629 4782 generic.go:334] "Generic (PLEG): container finished" podID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerID="b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9" exitCode=0 Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.824720 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j29m" event={"ID":"16d92c41-9da2-4d53-ac41-7f12943e6360","Type":"ContainerDied","Data":"b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9"} Jan 30 18:51:53 crc kubenswrapper[4782]: I0130 18:51:53.862615 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 18:51:58 crc kubenswrapper[4782]: W0130 18:51:54.113319 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd01ba4_1545_47eb_8aed_7b7c23b939b5.slice/crio-8a40ca7096e01681b6cf3892811423049f51936fb2c4e4d3a7e8dbd9083a0ea1 WatchSource:0}: Error finding container 8a40ca7096e01681b6cf3892811423049f51936fb2c4e4d3a7e8dbd9083a0ea1: Status 404 returned error can't find the container with id 8a40ca7096e01681b6cf3892811423049f51936fb2c4e4d3a7e8dbd9083a0ea1 Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:54.128783 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8fb4d68c5-ftns8"] Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:54.426780 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa475e2-5c3b-4298-b5a4-e291f861dd0e" path="/var/lib/kubelet/pods/aaa475e2-5c3b-4298-b5a4-e291f861dd0e/volumes" Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:54.842142 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"858bbcbd-4a47-42ee-a581-2b03ca45dcaa","Type":"ContainerStarted","Data":"83083a55143dbb55cdd8d96c37c5204b1dd0d5f3640a5359cd07f54fea031213"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:54.842437 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"858bbcbd-4a47-42ee-a581-2b03ca45dcaa","Type":"ContainerStarted","Data":"7fd430bd8801e70d62f42977dfe797210de7faf138c9d6ee79fb9bf1686f3cf3"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:54.844256 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" event={"ID":"fcd01ba4-1545-47eb-8aed-7b7c23b939b5","Type":"ContainerStarted","Data":"fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:54.844302 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" event={"ID":"fcd01ba4-1545-47eb-8aed-7b7c23b939b5","Type":"ContainerStarted","Data":"8a40ca7096e01681b6cf3892811423049f51936fb2c4e4d3a7e8dbd9083a0ea1"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:54.864689 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.864669677 podStartE2EDuration="2.864669677s" podCreationTimestamp="2026-01-30 18:51:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:54.857834018 +0000 UTC m=+1291.126212043" watchObservedRunningTime="2026-01-30 18:51:54.864669677 +0000 UTC m=+1291.133047702" Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:55.918847 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:55.919403 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-api" containerID="cri-o://71855bea56dcb1811850e5182e09d49776be777c8b7d0b01e0bce1065310ca5c" gracePeriod=30 Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:55.919653 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-log" containerID="cri-o://8193bd46b48d558be095a8524ee2dd06a10755718f16293d67574b925dfa72b2" gracePeriod=30 Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:56.864014 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j29m" event={"ID":"16d92c41-9da2-4d53-ac41-7f12943e6360","Type":"ContainerStarted","Data":"f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:56.866554 4782 generic.go:334] "Generic (PLEG): container finished" podID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerID="fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf" exitCode=0 Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:56.866593 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" event={"ID":"fcd01ba4-1545-47eb-8aed-7b7c23b939b5","Type":"ContainerDied","Data":"fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:56.877046 4782 generic.go:334] "Generic (PLEG): container finished" podID="4fcab3e1-35ab-441d-aed7-b39057f9541f" 
containerID="8193bd46b48d558be095a8524ee2dd06a10755718f16293d67574b925dfa72b2" exitCode=143 Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:56.877130 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fcab3e1-35ab-441d-aed7-b39057f9541f","Type":"ContainerDied","Data":"8193bd46b48d558be095a8524ee2dd06a10755718f16293d67574b925dfa72b2"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:56.881982 4782 generic.go:334] "Generic (PLEG): container finished" podID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerID="b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64" exitCode=0 Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:56.882031 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22nc8" event={"ID":"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd","Type":"ContainerDied","Data":"b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.216138 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.902618 4782 generic.go:334] "Generic (PLEG): container finished" podID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerID="71855bea56dcb1811850e5182e09d49776be777c8b7d0b01e0bce1065310ca5c" exitCode=0 Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.902879 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fcab3e1-35ab-441d-aed7-b39057f9541f","Type":"ContainerDied","Data":"71855bea56dcb1811850e5182e09d49776be777c8b7d0b01e0bce1065310ca5c"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.906091 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22nc8" event={"ID":"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd","Type":"ContainerStarted","Data":"33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.909460 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" event={"ID":"fcd01ba4-1545-47eb-8aed-7b7c23b939b5","Type":"ContainerStarted","Data":"5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1"} Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.910122 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.935326 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-22nc8" podStartSLOduration=2.349496655 podStartE2EDuration="7.935299496s" podCreationTimestamp="2026-01-30 18:51:51 +0000 UTC" firstStartedPulling="2026-01-30 18:51:52.787737789 +0000 UTC m=+1289.056115814" lastFinishedPulling="2026-01-30 18:51:58.37354061 +0000 UTC m=+1294.641918655" observedRunningTime="2026-01-30 18:51:58.924275603 +0000 UTC m=+1295.192653628" watchObservedRunningTime="2026-01-30 18:51:58.935299496 +0000 UTC m=+1295.203677531" Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.945132 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8j29m" podStartSLOduration=6.660738445 podStartE2EDuration="9.945087818s" podCreationTimestamp="2026-01-30 18:51:49 +0000 UTC" firstStartedPulling="2026-01-30 18:51:51.764692391 +0000 UTC m=+1288.033070416" lastFinishedPulling="2026-01-30 
18:51:55.049041754 +0000 UTC m=+1291.317419789" observedRunningTime="2026-01-30 18:51:58.944015781 +0000 UTC m=+1295.212393806" watchObservedRunningTime="2026-01-30 18:51:58.945087818 +0000 UTC m=+1295.213465853" Jan 30 18:51:58 crc kubenswrapper[4782]: I0130 18:51:58.980200 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" podStartSLOduration=5.980177615 podStartE2EDuration="5.980177615s" podCreationTimestamp="2026-01-30 18:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:51:58.969816519 +0000 UTC m=+1295.238194554" watchObservedRunningTime="2026-01-30 18:51:58.980177615 +0000 UTC m=+1295.248555640" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.077463 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.147871 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fcab3e1-35ab-441d-aed7-b39057f9541f-logs\") pod \"4fcab3e1-35ab-441d-aed7-b39057f9541f\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.148108 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8zzp\" (UniqueName: \"kubernetes.io/projected/4fcab3e1-35ab-441d-aed7-b39057f9541f-kube-api-access-m8zzp\") pod \"4fcab3e1-35ab-441d-aed7-b39057f9541f\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.148257 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-combined-ca-bundle\") pod \"4fcab3e1-35ab-441d-aed7-b39057f9541f\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.148295 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-config-data\") pod \"4fcab3e1-35ab-441d-aed7-b39057f9541f\" (UID: \"4fcab3e1-35ab-441d-aed7-b39057f9541f\") " Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.151545 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fcab3e1-35ab-441d-aed7-b39057f9541f-logs" (OuterVolumeSpecName: "logs") pod "4fcab3e1-35ab-441d-aed7-b39057f9541f" (UID: "4fcab3e1-35ab-441d-aed7-b39057f9541f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.184475 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fcab3e1-35ab-441d-aed7-b39057f9541f-kube-api-access-m8zzp" (OuterVolumeSpecName: "kube-api-access-m8zzp") pod "4fcab3e1-35ab-441d-aed7-b39057f9541f" (UID: "4fcab3e1-35ab-441d-aed7-b39057f9541f"). InnerVolumeSpecName "kube-api-access-m8zzp". 
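
The reconciler and operation_generator entries around this point trace the full volume lifecycle the kubelet logs for these pods: VerifyControllerAttachedVolume and "MountVolume started" followed by "MountVolume.SetUp succeeded" while nova-cell1-novncproxy-0, dnsmasq-dns-8fb4d68c5-ftns8 and the new nova-api-0 come up, then "UnmountVolume started", TearDown and "Volume detached" as the old nova-api-0 is deleted. As a reading aid only, here is a minimal Python sketch, assuming the excerpt has been saved to a local file (the path is a placeholder, one journal entry per line as journalctl prints it); it groups those messages per pod and volume and reports which stages never appeared.

#!/usr/bin/env python3
"""Group volume lifecycle events per (pod, volume) in a saved kubelet journal excerpt.

Sketch only: LOG_FILE is a placeholder, and the message fragments below are copied from
the log entries visible above (including their escaped quotes).
"""
import re
from collections import defaultdict

LOG_FILE = "kubelet-journal.log"  # placeholder path, not taken from the log

STAGES = {
    "attach-verified": "operationExecutor.VerifyControllerAttachedVolume started",
    "mount-started":   "operationExecutor.MountVolume started",
    "mount-succeeded": "MountVolume.SetUp succeeded",
    "unmount-started": "operationExecutor.UnmountVolume started",
    "detached":        "Volume detached for volume",
}

# Volume names appear as escaped quotes inside the klog message: volume \"name\".
VOL_RE = re.compile(r'for volume \\"(?P<volume>[^"\\]+)\\"')
# Mount-side entries end with pod="namespace/name"; unmount/detach entries only carry
# the pod UID, so those fall back to "?" here.
POD_RE = re.compile(r'pod="(?P<pod>[^"]+)"')

def main() -> None:
    seen = defaultdict(set)
    with open(LOG_FILE, encoding="utf-8") as fh:
        for line in fh:
            stage = next((name for name, frag in STAGES.items() if frag in line), None)
            if stage is None:
                continue
            vol = VOL_RE.search(line)
            pod = POD_RE.search(line)
            key = (pod.group("pod") if pod else "?", vol.group("volume") if vol else "?")
            seen[key].add(stage)

    for (pod, volume), stages in sorted(seen.items()):
        missing = [s for s in STAGES if s not in stages]
        print(f"{pod:45s} {volume:30s} stages={sorted(stages)} missing={missing}")

if __name__ == "__main__":
    main()
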
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.196372 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fcab3e1-35ab-441d-aed7-b39057f9541f" (UID: "4fcab3e1-35ab-441d-aed7-b39057f9541f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.217375 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-config-data" (OuterVolumeSpecName: "config-data") pod "4fcab3e1-35ab-441d-aed7-b39057f9541f" (UID: "4fcab3e1-35ab-441d-aed7-b39057f9541f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.251106 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8zzp\" (UniqueName: \"kubernetes.io/projected/4fcab3e1-35ab-441d-aed7-b39057f9541f-kube-api-access-m8zzp\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.251140 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.251149 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fcab3e1-35ab-441d-aed7-b39057f9541f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.251158 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fcab3e1-35ab-441d-aed7-b39057f9541f-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.936446 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.937081 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4fcab3e1-35ab-441d-aed7-b39057f9541f","Type":"ContainerDied","Data":"95e37c42061b9ae281246319ba4d90180c18c5e56d29e8ccb54ecb956e79a4a5"} Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.937122 4782 scope.go:117] "RemoveContainer" containerID="71855bea56dcb1811850e5182e09d49776be777c8b7d0b01e0bce1065310ca5c" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.977987 4782 scope.go:117] "RemoveContainer" containerID="8193bd46b48d558be095a8524ee2dd06a10755718f16293d67574b925dfa72b2" Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.981666 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:51:59 crc kubenswrapper[4782]: I0130 18:51:59.990979 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.006304 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:00 crc kubenswrapper[4782]: E0130 18:52:00.006717 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-api" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.006735 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-api" Jan 30 18:52:00 crc kubenswrapper[4782]: E0130 18:52:00.006753 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-log" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.006761 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-log" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.006953 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-log" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.006972 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" containerName="nova-api-api" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.012376 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.017174 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.017708 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.020851 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.053376 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.061499 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.061548 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.071051 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.071109 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2610647-0a40-45d9-806c-6d1e737caf21-logs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.071156 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-config-data\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.071319 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.071380 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qdg\" (UniqueName: \"kubernetes.io/projected/e2610647-0a40-45d9-806c-6d1e737caf21-kube-api-access-s2qdg\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.071403 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.117951 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.172569 
4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.172667 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qdg\" (UniqueName: \"kubernetes.io/projected/e2610647-0a40-45d9-806c-6d1e737caf21-kube-api-access-s2qdg\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.172697 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.172737 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.172756 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2610647-0a40-45d9-806c-6d1e737caf21-logs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.172806 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-config-data\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.173472 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2610647-0a40-45d9-806c-6d1e737caf21-logs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.177161 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.179842 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.179889 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-config-data\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.180374 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.190030 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qdg\" (UniqueName: \"kubernetes.io/projected/e2610647-0a40-45d9-806c-6d1e737caf21-kube-api-access-s2qdg\") pod \"nova-api-0\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.363595 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.426902 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fcab3e1-35ab-441d-aed7-b39057f9541f" path="/var/lib/kubelet/pods/4fcab3e1-35ab-441d-aed7-b39057f9541f/volumes" Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.444652 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.444939 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-central-agent" containerID="cri-o://8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a" gracePeriod=30 Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.445027 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="sg-core" containerID="cri-o://83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b" gracePeriod=30 Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.445095 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="proxy-httpd" containerID="cri-o://fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac" gracePeriod=30 Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.445070 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-notification-agent" containerID="cri-o://2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12" gracePeriod=30 Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.917211 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:00 crc kubenswrapper[4782]: W0130 18:52:00.928440 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2610647_0a40_45d9_806c_6d1e737caf21.slice/crio-80ff495a02e375c74b0928e8cebc3494da7d503df3134ff243f4f9919850283e WatchSource:0}: Error finding container 80ff495a02e375c74b0928e8cebc3494da7d503df3134ff243f4f9919850283e: Status 404 returned error can't find the container with id 80ff495a02e375c74b0928e8cebc3494da7d503df3134ff243f4f9919850283e Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.950174 4782 generic.go:334] "Generic (PLEG): container finished" podID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerID="fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac" exitCode=0 Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.950203 4782 
generic.go:334] "Generic (PLEG): container finished" podID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerID="83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b" exitCode=2 Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.950211 4782 generic.go:334] "Generic (PLEG): container finished" podID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerID="8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a" exitCode=0 Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.950255 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerDied","Data":"fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac"} Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.950277 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerDied","Data":"83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b"} Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.950287 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerDied","Data":"8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a"} Jan 30 18:52:00 crc kubenswrapper[4782]: I0130 18:52:00.951577 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2610647-0a40-45d9-806c-6d1e737caf21","Type":"ContainerStarted","Data":"80ff495a02e375c74b0928e8cebc3494da7d503df3134ff243f4f9919850283e"} Jan 30 18:52:01 crc kubenswrapper[4782]: I0130 18:52:01.035826 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:52:01 crc kubenswrapper[4782]: I0130 18:52:01.817028 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:52:01 crc kubenswrapper[4782]: I0130 18:52:01.817490 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:52:01 crc kubenswrapper[4782]: I0130 18:52:01.883458 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j29m"] Jan 30 18:52:01 crc kubenswrapper[4782]: I0130 18:52:01.968562 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2610647-0a40-45d9-806c-6d1e737caf21","Type":"ContainerStarted","Data":"63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101"} Jan 30 18:52:01 crc kubenswrapper[4782]: I0130 18:52:01.968618 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2610647-0a40-45d9-806c-6d1e737caf21","Type":"ContainerStarted","Data":"67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056"} Jan 30 18:52:01 crc kubenswrapper[4782]: I0130 18:52:01.995569 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.995545469 podStartE2EDuration="2.995545469s" podCreationTimestamp="2026-01-30 18:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:52:01.987632664 +0000 UTC m=+1298.256010699" watchObservedRunningTime="2026-01-30 18:52:01.995545469 +0000 UTC m=+1298.263923504" Jan 30 18:52:02 crc kubenswrapper[4782]: 
I0130 18:52:02.878010 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-22nc8" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="registry-server" probeResult="failure" output=< Jan 30 18:52:02 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 18:52:02 crc kubenswrapper[4782]: > Jan 30 18:52:02 crc kubenswrapper[4782]: I0130 18:52:02.990976 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8j29m" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="registry-server" containerID="cri-o://f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16" gracePeriod=2 Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.216508 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.235375 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.528874 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.544360 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.616142 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5dc4879-sfqkw"] Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.616384 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" podUID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerName="dnsmasq-dns" containerID="cri-o://3257eb30ca752fa948613865fd62ec290d05a9f28de6035143c2d1c5f3a1f875" gracePeriod=10 Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.652710 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5fjp\" (UniqueName: \"kubernetes.io/projected/16d92c41-9da2-4d53-ac41-7f12943e6360-kube-api-access-r5fjp\") pod \"16d92c41-9da2-4d53-ac41-7f12943e6360\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.652872 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-catalog-content\") pod \"16d92c41-9da2-4d53-ac41-7f12943e6360\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.652909 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-utilities\") pod \"16d92c41-9da2-4d53-ac41-7f12943e6360\" (UID: \"16d92c41-9da2-4d53-ac41-7f12943e6360\") " Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.654069 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-utilities" (OuterVolumeSpecName: "utilities") pod "16d92c41-9da2-4d53-ac41-7f12943e6360" (UID: "16d92c41-9da2-4d53-ac41-7f12943e6360"). InnerVolumeSpecName "utilities". 
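
Just above, the kubelet asks the runtime to stop containers with explicit grace periods (gracePeriod=30 for the nova-api and ceilometer containers, 10 for the old dnsmasq-dns pod, 2 for the marketplace registry-server), and the matching "Generic (PLEG): container finished" events then report the exit codes: 143 is 128+SIGTERM, meaning the process ended on the TERM signal sent for the graceful stop, while 0 is a clean shutdown. The following is a minimal Python sketch, assuming the excerpt is saved to a placeholder file, that pairs kill requests with the exit code later reported for the same container ID.

#!/usr/bin/env python3
"""Pair "Killing container with a grace period" requests with the exit code reported by
the later "container finished" event, keyed on the cri-o container ID.

Sketch only: LOG_FILE is a placeholder and the regexes mirror the message shapes above.
"""
import re

LOG_FILE = "kubelet-journal.log"  # placeholder path

KILL_RE = re.compile(
    r'Killing container with a grace period.*?'
    r'containerName="(?P<name>[^"]+)" containerID="cri-o://(?P<cid>[0-9a-f]+)" '
    r'gracePeriod=(?P<grace>\d+)'
)
DIED_RE = re.compile(
    r'container finished.*?containerID="(?P<cid>[0-9a-f]+)" exitCode=(?P<code>-?\d+)'
)

kills = {}  # container ID -> (container name, grace period in seconds)
with open(LOG_FILE, encoding="utf-8") as fh:
    for line in fh:
        if (m := KILL_RE.search(line)):
            kills[m["cid"]] = (m["name"], m["grace"])
        elif (m := DIED_RE.search(line)):
            code = int(m["code"])
            name, grace = kills.get(m["cid"], ("(no kill logged)", "-"))
            note = {143: "128+SIGTERM", 0: "clean exit"}.get(code, "")
            print(f"{m['cid'][:12]} {name:28s} grace={grace:>3}s exitCode={code:4d} {note}")
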
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.661974 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d92c41-9da2-4d53-ac41-7f12943e6360-kube-api-access-r5fjp" (OuterVolumeSpecName: "kube-api-access-r5fjp") pod "16d92c41-9da2-4d53-ac41-7f12943e6360" (UID: "16d92c41-9da2-4d53-ac41-7f12943e6360"). InnerVolumeSpecName "kube-api-access-r5fjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.682531 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16d92c41-9da2-4d53-ac41-7f12943e6360" (UID: "16d92c41-9da2-4d53-ac41-7f12943e6360"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.755616 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5fjp\" (UniqueName: \"kubernetes.io/projected/16d92c41-9da2-4d53-ac41-7f12943e6360-kube-api-access-r5fjp\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.756028 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:03 crc kubenswrapper[4782]: I0130 18:52:03.756045 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16d92c41-9da2-4d53-ac41-7f12943e6360-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.002444 4782 generic.go:334] "Generic (PLEG): container finished" podID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerID="3257eb30ca752fa948613865fd62ec290d05a9f28de6035143c2d1c5f3a1f875" exitCode=0 Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.002522 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" event={"ID":"30715f55-899e-47c8-a6f2-284ce89e38fa","Type":"ContainerDied","Data":"3257eb30ca752fa948613865fd62ec290d05a9f28de6035143c2d1c5f3a1f875"} Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.004384 4782 generic.go:334] "Generic (PLEG): container finished" podID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerID="f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16" exitCode=0 Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.005401 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j29m" event={"ID":"16d92c41-9da2-4d53-ac41-7f12943e6360","Type":"ContainerDied","Data":"f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16"} Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.005459 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8j29m" event={"ID":"16d92c41-9da2-4d53-ac41-7f12943e6360","Type":"ContainerDied","Data":"e49ac394848548fc0bbfd03bade755cdc1e2cd7d2c358d3523da7d899d17a2ad"} Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.005459 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8j29m" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.005477 4782 scope.go:117] "RemoveContainer" containerID="f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.024792 4782 scope.go:117] "RemoveContainer" containerID="b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.032454 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.051716 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j29m"] Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.079589 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8j29m"] Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.083900 4782 scope.go:117] "RemoveContainer" containerID="cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.109204 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.123509 4782 scope.go:117] "RemoveContainer" containerID="f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16" Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.127361 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16\": container with ID starting with f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16 not found: ID does not exist" containerID="f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.127425 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16"} err="failed to get container status \"f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16\": rpc error: code = NotFound desc = could not find container \"f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16\": container with ID starting with f1e01c6244302704fb4893923ab5a608bda079c18d7f8f68f7a4a3ed496b7c16 not found: ID does not exist" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.127458 4782 scope.go:117] "RemoveContainer" containerID="b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9" Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.128181 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9\": container with ID starting with b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9 not found: ID does not exist" containerID="b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.128239 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9"} err="failed to get container status \"b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9\": rpc error: 
code = NotFound desc = could not find container \"b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9\": container with ID starting with b384e1a2b228bae5631bebfd3c2f5fdaf6f324aea3b38b7b188399ee164948c9 not found: ID does not exist" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.128257 4782 scope.go:117] "RemoveContainer" containerID="cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99" Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.130444 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99\": container with ID starting with cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99 not found: ID does not exist" containerID="cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.130491 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99"} err="failed to get container status \"cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99\": rpc error: code = NotFound desc = could not find container \"cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99\": container with ID starting with cb4680d58436c1b7d7fbed0f09897033075c49b2a2d139bc84506089881b3c99 not found: ID does not exist" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.169002 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-nb\") pod \"30715f55-899e-47c8-a6f2-284ce89e38fa\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.169074 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0\") pod \"30715f55-899e-47c8-a6f2-284ce89e38fa\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.169192 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-sb\") pod \"30715f55-899e-47c8-a6f2-284ce89e38fa\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.169339 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-config\") pod \"30715f55-899e-47c8-a6f2-284ce89e38fa\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.169461 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g77k2\" (UniqueName: \"kubernetes.io/projected/30715f55-899e-47c8-a6f2-284ce89e38fa-kube-api-access-g77k2\") pod \"30715f55-899e-47c8-a6f2-284ce89e38fa\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.169512 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-svc\") pod \"30715f55-899e-47c8-a6f2-284ce89e38fa\" 
(UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.185653 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30715f55-899e-47c8-a6f2-284ce89e38fa-kube-api-access-g77k2" (OuterVolumeSpecName: "kube-api-access-g77k2") pod "30715f55-899e-47c8-a6f2-284ce89e38fa" (UID: "30715f55-899e-47c8-a6f2-284ce89e38fa"). InnerVolumeSpecName "kube-api-access-g77k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.234835 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "30715f55-899e-47c8-a6f2-284ce89e38fa" (UID: "30715f55-899e-47c8-a6f2-284ce89e38fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.243427 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "30715f55-899e-47c8-a6f2-284ce89e38fa" (UID: "30715f55-899e-47c8-a6f2-284ce89e38fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.250114 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "30715f55-899e-47c8-a6f2-284ce89e38fa" (UID: "30715f55-899e-47c8-a6f2-284ce89e38fa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.261806 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-config" (OuterVolumeSpecName: "config") pod "30715f55-899e-47c8-a6f2-284ce89e38fa" (UID: "30715f55-899e-47c8-a6f2-284ce89e38fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272151 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "30715f55-899e-47c8-a6f2-284ce89e38fa" (UID: "30715f55-899e-47c8-a6f2-284ce89e38fa"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272356 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0\") pod \"30715f55-899e-47c8-a6f2-284ce89e38fa\" (UID: \"30715f55-899e-47c8-a6f2-284ce89e38fa\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272833 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272852 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272862 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g77k2\" (UniqueName: \"kubernetes.io/projected/30715f55-899e-47c8-a6f2-284ce89e38fa-kube-api-access-g77k2\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272874 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272883 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: W0130 18:52:04.272967 4782 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/30715f55-899e-47c8-a6f2-284ce89e38fa/volumes/kubernetes.io~configmap/dns-swift-storage-0 Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.272978 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "30715f55-899e-47c8-a6f2-284ce89e38fa" (UID: "30715f55-899e-47c8-a6f2-284ce89e38fa"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.289774 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xwqx2"] Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.290255 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="registry-server" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.290270 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="registry-server" Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.290292 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerName="dnsmasq-dns" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.290301 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerName="dnsmasq-dns" Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.290311 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerName="init" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.290319 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerName="init" Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.290333 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="extract-content" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.290342 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="extract-content" Jan 30 18:52:04 crc kubenswrapper[4782]: E0130 18:52:04.290356 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="extract-utilities" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.290373 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="extract-utilities" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.290571 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" containerName="registry-server" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.290595 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="30715f55-899e-47c8-a6f2-284ce89e38fa" containerName="dnsmasq-dns" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.291338 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.294197 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.294399 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.314295 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xwqx2"] Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.374038 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-config-data\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.374090 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mg2n\" (UniqueName: \"kubernetes.io/projected/d18a53ee-3844-44b6-b8f2-149dd7b6f725-kube-api-access-9mg2n\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.374118 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-scripts\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.374180 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.374312 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/30715f55-899e-47c8-a6f2-284ce89e38fa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.422468 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d92c41-9da2-4d53-ac41-7f12943e6360" path="/var/lib/kubelet/pods/16d92c41-9da2-4d53-ac41-7f12943e6360/volumes" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.475516 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-config-data\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.475553 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mg2n\" (UniqueName: \"kubernetes.io/projected/d18a53ee-3844-44b6-b8f2-149dd7b6f725-kube-api-access-9mg2n\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc 
kubenswrapper[4782]: I0130 18:52:04.475578 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-scripts\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.475643 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.481028 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-scripts\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.481093 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.493965 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-config-data\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.510458 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mg2n\" (UniqueName: \"kubernetes.io/projected/d18a53ee-3844-44b6-b8f2-149dd7b6f725-kube-api-access-9mg2n\") pod \"nova-cell1-cell-mapping-xwqx2\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.642339 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.643487 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.780360 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn6mq\" (UniqueName: \"kubernetes.io/projected/4a2874ba-472b-468f-9aa6-3a48320e2e0c-kube-api-access-mn6mq\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.780774 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-config-data\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.780825 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-sg-core-conf-yaml\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.780867 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-ceilometer-tls-certs\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.780935 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-combined-ca-bundle\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.781494 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-scripts\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.781552 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-run-httpd\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.781585 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-log-httpd\") pod \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\" (UID: \"4a2874ba-472b-468f-9aa6-3a48320e2e0c\") " Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.782783 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.786313 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.791069 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2874ba-472b-468f-9aa6-3a48320e2e0c-kube-api-access-mn6mq" (OuterVolumeSpecName: "kube-api-access-mn6mq") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "kube-api-access-mn6mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.830497 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-scripts" (OuterVolumeSpecName: "scripts") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.852253 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.884487 4782 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.884521 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.884533 4782 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.884566 4782 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a2874ba-472b-468f-9aa6-3a48320e2e0c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.884575 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn6mq\" (UniqueName: \"kubernetes.io/projected/4a2874ba-472b-468f-9aa6-3a48320e2e0c-kube-api-access-mn6mq\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.910266 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.942360 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.957026 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-config-data" (OuterVolumeSpecName: "config-data") pod "4a2874ba-472b-468f-9aa6-3a48320e2e0c" (UID: "4a2874ba-472b-468f-9aa6-3a48320e2e0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.986622 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.986658 4782 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:04 crc kubenswrapper[4782]: I0130 18:52:04.986669 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2874ba-472b-468f-9aa6-3a48320e2e0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.015202 4782 generic.go:334] "Generic (PLEG): container finished" podID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerID="2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12" exitCode=0 Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.015266 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.015280 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerDied","Data":"2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12"} Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.015303 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a2874ba-472b-468f-9aa6-3a48320e2e0c","Type":"ContainerDied","Data":"33fe1610cc458d96ea24ea204aa6c8a486d7d69121cd5c70930b0af5d112b3da"} Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.015319 4782 scope.go:117] "RemoveContainer" containerID="fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.020370 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" event={"ID":"30715f55-899e-47c8-a6f2-284ce89e38fa","Type":"ContainerDied","Data":"c7b6d9e8fd84ec60b526d2e239fe8ab74a0f267317a049747f9428c24c285d74"} Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.020400 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5dc4879-sfqkw" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.041083 4782 scope.go:117] "RemoveContainer" containerID="83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.071064 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5dc4879-sfqkw"] Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.077460 4782 scope.go:117] "RemoveContainer" containerID="2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.084326 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db5dc4879-sfqkw"] Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.092730 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.102735 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.108982 4782 scope.go:117] "RemoveContainer" containerID="8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.113445 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.113901 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="proxy-httpd" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.113913 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="proxy-httpd" Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.113926 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-central-agent" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.113932 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-central-agent" Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.113944 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="sg-core" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.113952 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="sg-core" Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.113969 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-notification-agent" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.113975 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-notification-agent" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.114159 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-central-agent" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.114180 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="sg-core" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.114190 4782 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="ceilometer-notification-agent" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.114200 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" containerName="proxy-httpd" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.116081 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.118270 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.118421 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.118494 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.124380 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.184725 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xwqx2"] Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.189539 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.189975 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.190028 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-config-data\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.190083 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1666e348-ad78-40db-be34-e66ea72a6af8-log-httpd\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.190102 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1666e348-ad78-40db-be34-e66ea72a6af8-run-httpd\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.190216 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8767v\" (UniqueName: \"kubernetes.io/projected/1666e348-ad78-40db-be34-e66ea72a6af8-kube-api-access-8767v\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc 
kubenswrapper[4782]: I0130 18:52:05.190269 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-scripts\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.190477 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.263372 4782 scope.go:117] "RemoveContainer" containerID="fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac" Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.268900 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac\": container with ID starting with fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac not found: ID does not exist" containerID="fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.268947 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac"} err="failed to get container status \"fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac\": rpc error: code = NotFound desc = could not find container \"fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac\": container with ID starting with fdedbab3d634c009e4da9849b8c3df97c7986e6cf8377eca6a06be2ced3799ac not found: ID does not exist" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.268978 4782 scope.go:117] "RemoveContainer" containerID="83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b" Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.269463 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b\": container with ID starting with 83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b not found: ID does not exist" containerID="83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.269495 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b"} err="failed to get container status \"83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b\": rpc error: code = NotFound desc = could not find container \"83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b\": container with ID starting with 83d0d1bba647d0713b98ff92a08522565460ee6596950c4c206b4320ca68cf0b not found: ID does not exist" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.269511 4782 scope.go:117] "RemoveContainer" containerID="2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12" Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.270811 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12\": container with ID starting with 2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12 not found: ID does not exist" containerID="2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.270849 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12"} err="failed to get container status \"2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12\": rpc error: code = NotFound desc = could not find container \"2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12\": container with ID starting with 2d38b2b7baeedffd0b7b0d3c4dcac24f225cff191ba24c2fb1a273fa1a3cbf12 not found: ID does not exist" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.270867 4782 scope.go:117] "RemoveContainer" containerID="8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a" Jan 30 18:52:05 crc kubenswrapper[4782]: E0130 18:52:05.307428 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a\": container with ID starting with 8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a not found: ID does not exist" containerID="8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.307476 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a"} err="failed to get container status \"8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a\": rpc error: code = NotFound desc = could not find container \"8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a\": container with ID starting with 8ccf5ba62ff26addd2791855a0e2be5cf4e7bdbe5185918a285d82b744789c6a not found: ID does not exist" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.307502 4782 scope.go:117] "RemoveContainer" containerID="3257eb30ca752fa948613865fd62ec290d05a9f28de6035143c2d1c5f3a1f875" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309686 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309735 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-config-data\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309797 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1666e348-ad78-40db-be34-e66ea72a6af8-log-httpd\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309822 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1666e348-ad78-40db-be34-e66ea72a6af8-run-httpd\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309851 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8767v\" (UniqueName: \"kubernetes.io/projected/1666e348-ad78-40db-be34-e66ea72a6af8-kube-api-access-8767v\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309874 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-scripts\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309923 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.309981 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.311136 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1666e348-ad78-40db-be34-e66ea72a6af8-run-httpd\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.326657 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1666e348-ad78-40db-be34-e66ea72a6af8-log-httpd\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.352219 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8767v\" (UniqueName: \"kubernetes.io/projected/1666e348-ad78-40db-be34-e66ea72a6af8-kube-api-access-8767v\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.352448 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-config-data\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.373818 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.380731 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.386115 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.390038 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1666e348-ad78-40db-be34-e66ea72a6af8-scripts\") pod \"ceilometer-0\" (UID: \"1666e348-ad78-40db-be34-e66ea72a6af8\") " pod="openstack/ceilometer-0" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.442606 4782 scope.go:117] "RemoveContainer" containerID="b5cfd805d47e6b6abbb85057b587f098120a4605886cfc397f56738b83d13023" Jan 30 18:52:05 crc kubenswrapper[4782]: I0130 18:52:05.552658 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 18:52:06 crc kubenswrapper[4782]: W0130 18:52:06.028152 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1666e348_ad78_40db_be34_e66ea72a6af8.slice/crio-c1fae9884d6a5d5215b1a3468ea7e533a4e52d8157e4b39b741d97b42158fb65 WatchSource:0}: Error finding container c1fae9884d6a5d5215b1a3468ea7e533a4e52d8157e4b39b741d97b42158fb65: Status 404 returned error can't find the container with id c1fae9884d6a5d5215b1a3468ea7e533a4e52d8157e4b39b741d97b42158fb65 Jan 30 18:52:06 crc kubenswrapper[4782]: I0130 18:52:06.029443 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 18:52:06 crc kubenswrapper[4782]: I0130 18:52:06.032324 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xwqx2" event={"ID":"d18a53ee-3844-44b6-b8f2-149dd7b6f725","Type":"ContainerStarted","Data":"34149667919c372d1708d52f8f7aa8cc0e64355b75cd2e4670631352cbe443b4"} Jan 30 18:52:06 crc kubenswrapper[4782]: I0130 18:52:06.032372 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xwqx2" event={"ID":"d18a53ee-3844-44b6-b8f2-149dd7b6f725","Type":"ContainerStarted","Data":"8c3f6aaf2b87d3c03aaddaea79c1979a23850842f802961680a958e36e38cca5"} Jan 30 18:52:06 crc kubenswrapper[4782]: I0130 18:52:06.052541 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xwqx2" podStartSLOduration=2.052523591 podStartE2EDuration="2.052523591s" podCreationTimestamp="2026-01-30 18:52:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:52:06.047672321 +0000 UTC m=+1302.316050346" watchObservedRunningTime="2026-01-30 18:52:06.052523591 +0000 UTC m=+1302.320901616" Jan 30 18:52:06 crc kubenswrapper[4782]: I0130 18:52:06.424948 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30715f55-899e-47c8-a6f2-284ce89e38fa" path="/var/lib/kubelet/pods/30715f55-899e-47c8-a6f2-284ce89e38fa/volumes" Jan 30 18:52:06 crc kubenswrapper[4782]: I0130 18:52:06.425895 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2874ba-472b-468f-9aa6-3a48320e2e0c" 
path="/var/lib/kubelet/pods/4a2874ba-472b-468f-9aa6-3a48320e2e0c/volumes" Jan 30 18:52:07 crc kubenswrapper[4782]: I0130 18:52:07.049272 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1666e348-ad78-40db-be34-e66ea72a6af8","Type":"ContainerStarted","Data":"c3790599fbd82ca7a82f2c1c5ca1f4cb786d3a2fdc59df6f3987ceb23ea3f405"} Jan 30 18:52:07 crc kubenswrapper[4782]: I0130 18:52:07.049617 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1666e348-ad78-40db-be34-e66ea72a6af8","Type":"ContainerStarted","Data":"765caaa254a80466104a65847a5324c63f55dd8c6c35a4b5904ae96ad49331ed"} Jan 30 18:52:07 crc kubenswrapper[4782]: I0130 18:52:07.049631 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1666e348-ad78-40db-be34-e66ea72a6af8","Type":"ContainerStarted","Data":"c1fae9884d6a5d5215b1a3468ea7e533a4e52d8157e4b39b741d97b42158fb65"} Jan 30 18:52:08 crc kubenswrapper[4782]: I0130 18:52:08.060294 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1666e348-ad78-40db-be34-e66ea72a6af8","Type":"ContainerStarted","Data":"13542445057d4ddf3f0ac470b7257d0501a2c7287ef0f7540010ea5629c62524"} Jan 30 18:52:10 crc kubenswrapper[4782]: I0130 18:52:10.078346 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1666e348-ad78-40db-be34-e66ea72a6af8","Type":"ContainerStarted","Data":"6653b8724a824adb339687beb5b856c950cccd82918735e26b5697b1600acaf4"} Jan 30 18:52:10 crc kubenswrapper[4782]: I0130 18:52:10.078971 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 18:52:10 crc kubenswrapper[4782]: I0130 18:52:10.111666 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.958223968 podStartE2EDuration="5.111645655s" podCreationTimestamp="2026-01-30 18:52:05 +0000 UTC" firstStartedPulling="2026-01-30 18:52:06.033420778 +0000 UTC m=+1302.301798803" lastFinishedPulling="2026-01-30 18:52:09.186842445 +0000 UTC m=+1305.455220490" observedRunningTime="2026-01-30 18:52:10.103684678 +0000 UTC m=+1306.372062703" watchObservedRunningTime="2026-01-30 18:52:10.111645655 +0000 UTC m=+1306.380023690" Jan 30 18:52:10 crc kubenswrapper[4782]: I0130 18:52:10.364466 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:52:10 crc kubenswrapper[4782]: I0130 18:52:10.364522 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:52:11 crc kubenswrapper[4782]: I0130 18:52:11.100482 4782 generic.go:334] "Generic (PLEG): container finished" podID="d18a53ee-3844-44b6-b8f2-149dd7b6f725" containerID="34149667919c372d1708d52f8f7aa8cc0e64355b75cd2e4670631352cbe443b4" exitCode=0 Jan 30 18:52:11 crc kubenswrapper[4782]: I0130 18:52:11.100579 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xwqx2" event={"ID":"d18a53ee-3844-44b6-b8f2-149dd7b6f725","Type":"ContainerDied","Data":"34149667919c372d1708d52f8f7aa8cc0e64355b75cd2e4670631352cbe443b4"} Jan 30 18:52:11 crc kubenswrapper[4782]: I0130 18:52:11.377376 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Jan 30 18:52:11 crc kubenswrapper[4782]: I0130 18:52:11.377440 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 18:52:11 crc kubenswrapper[4782]: I0130 18:52:11.882478 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:52:11 crc kubenswrapper[4782]: I0130 18:52:11.943056 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.119275 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22nc8"] Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.461780 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.582088 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-scripts\") pod \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.582394 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-combined-ca-bundle\") pod \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.584467 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mg2n\" (UniqueName: \"kubernetes.io/projected/d18a53ee-3844-44b6-b8f2-149dd7b6f725-kube-api-access-9mg2n\") pod \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.584762 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-config-data\") pod \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\" (UID: \"d18a53ee-3844-44b6-b8f2-149dd7b6f725\") " Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.588392 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-scripts" (OuterVolumeSpecName: "scripts") pod "d18a53ee-3844-44b6-b8f2-149dd7b6f725" (UID: "d18a53ee-3844-44b6-b8f2-149dd7b6f725"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.588862 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18a53ee-3844-44b6-b8f2-149dd7b6f725-kube-api-access-9mg2n" (OuterVolumeSpecName: "kube-api-access-9mg2n") pod "d18a53ee-3844-44b6-b8f2-149dd7b6f725" (UID: "d18a53ee-3844-44b6-b8f2-149dd7b6f725"). InnerVolumeSpecName "kube-api-access-9mg2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.616851 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-config-data" (OuterVolumeSpecName: "config-data") pod "d18a53ee-3844-44b6-b8f2-149dd7b6f725" (UID: "d18a53ee-3844-44b6-b8f2-149dd7b6f725"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.631460 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d18a53ee-3844-44b6-b8f2-149dd7b6f725" (UID: "d18a53ee-3844-44b6-b8f2-149dd7b6f725"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.686959 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mg2n\" (UniqueName: \"kubernetes.io/projected/d18a53ee-3844-44b6-b8f2-149dd7b6f725-kube-api-access-9mg2n\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.687136 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.687189 4782 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:12 crc kubenswrapper[4782]: I0130 18:52:12.687252 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d18a53ee-3844-44b6-b8f2-149dd7b6f725-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.120027 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-22nc8" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="registry-server" containerID="cri-o://33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d" gracePeriod=2 Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.122449 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xwqx2" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.122635 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xwqx2" event={"ID":"d18a53ee-3844-44b6-b8f2-149dd7b6f725","Type":"ContainerDied","Data":"8c3f6aaf2b87d3c03aaddaea79c1979a23850842f802961680a958e36e38cca5"} Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.122711 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c3f6aaf2b87d3c03aaddaea79c1979a23850842f802961680a958e36e38cca5" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.370026 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.370315 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-log" containerID="cri-o://67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056" gracePeriod=30 Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.370841 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-api" containerID="cri-o://63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101" gracePeriod=30 Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.394995 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.395521 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2d111af9-ea7a-4826-a405-c14931dfd7b3" containerName="nova-scheduler-scheduler" containerID="cri-o://843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821" gracePeriod=30 Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.441165 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.441678 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-log" containerID="cri-o://8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94" gracePeriod=30 Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.442173 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-metadata" containerID="cri-o://04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80" gracePeriod=30 Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.744817 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.808196 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz8b9\" (UniqueName: \"kubernetes.io/projected/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-kube-api-access-dz8b9\") pod \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.808448 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-catalog-content\") pod \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.808548 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-utilities\") pod \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\" (UID: \"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd\") " Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.809769 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-utilities" (OuterVolumeSpecName: "utilities") pod "464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" (UID: "464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.834426 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-kube-api-access-dz8b9" (OuterVolumeSpecName: "kube-api-access-dz8b9") pod "464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" (UID: "464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd"). InnerVolumeSpecName "kube-api-access-dz8b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.910771 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.910804 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz8b9\" (UniqueName: \"kubernetes.io/projected/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-kube-api-access-dz8b9\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:13 crc kubenswrapper[4782]: I0130 18:52:13.933577 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" (UID: "464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.016148 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.130772 4782 generic.go:334] "Generic (PLEG): container finished" podID="1950feda-6261-4e6d-8edd-26caa31998b4" containerID="8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94" exitCode=143 Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.130832 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1950feda-6261-4e6d-8edd-26caa31998b4","Type":"ContainerDied","Data":"8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94"} Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.132841 4782 generic.go:334] "Generic (PLEG): container finished" podID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerID="33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d" exitCode=0 Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.132882 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22nc8" event={"ID":"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd","Type":"ContainerDied","Data":"33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d"} Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.132899 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22nc8" event={"ID":"464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd","Type":"ContainerDied","Data":"fa9489977de2fc5ac94bb77980ac3bd9d58c0ba7fa91ba7cd13bf01ba96d8eff"} Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.132915 4782 scope.go:117] "RemoveContainer" containerID="33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.132938 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22nc8" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.135644 4782 generic.go:334] "Generic (PLEG): container finished" podID="e2610647-0a40-45d9-806c-6d1e737caf21" containerID="67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056" exitCode=143 Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.135682 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2610647-0a40-45d9-806c-6d1e737caf21","Type":"ContainerDied","Data":"67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056"} Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.160217 4782 scope.go:117] "RemoveContainer" containerID="b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.163571 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22nc8"] Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.172706 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-22nc8"] Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.189331 4782 scope.go:117] "RemoveContainer" containerID="3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.212590 4782 scope.go:117] "RemoveContainer" containerID="33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d" Jan 30 18:52:14 crc kubenswrapper[4782]: E0130 18:52:14.212916 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d\": container with ID starting with 33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d not found: ID does not exist" containerID="33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.212945 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d"} err="failed to get container status \"33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d\": rpc error: code = NotFound desc = could not find container \"33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d\": container with ID starting with 33a70c976cd8f9e96eaf754ebfc2450d3425095e5bbac0a2cdbd52a638e6249d not found: ID does not exist" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.212967 4782 scope.go:117] "RemoveContainer" containerID="b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64" Jan 30 18:52:14 crc kubenswrapper[4782]: E0130 18:52:14.213134 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64\": container with ID starting with b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64 not found: ID does not exist" containerID="b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.213154 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64"} err="failed to get container status \"b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64\": rpc error: 
code = NotFound desc = could not find container \"b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64\": container with ID starting with b310b3b904ab3f05664b4b3e99abb92450d9825314b165c7c0e305d2dec47d64 not found: ID does not exist" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.213166 4782 scope.go:117] "RemoveContainer" containerID="3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b" Jan 30 18:52:14 crc kubenswrapper[4782]: E0130 18:52:14.213336 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b\": container with ID starting with 3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b not found: ID does not exist" containerID="3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.213349 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b"} err="failed to get container status \"3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b\": rpc error: code = NotFound desc = could not find container \"3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b\": container with ID starting with 3d172e7f2c3a6fa3eb462500efb95aa7dae0159909acd1d966736e790c428e6b not found: ID does not exist" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.334802 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": read tcp 10.217.0.2:44300->10.217.0.217:8775: read: connection reset by peer" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.334922 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": read tcp 10.217.0.2:44288->10.217.0.217:8775: read: connection reset by peer" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.432415 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" path="/var/lib/kubelet/pods/464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd/volumes" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.808938 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.937517 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp57p\" (UniqueName: \"kubernetes.io/projected/1950feda-6261-4e6d-8edd-26caa31998b4-kube-api-access-fp57p\") pod \"1950feda-6261-4e6d-8edd-26caa31998b4\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.937596 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-combined-ca-bundle\") pod \"1950feda-6261-4e6d-8edd-26caa31998b4\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.937626 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-nova-metadata-tls-certs\") pod \"1950feda-6261-4e6d-8edd-26caa31998b4\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.937811 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1950feda-6261-4e6d-8edd-26caa31998b4-logs\") pod \"1950feda-6261-4e6d-8edd-26caa31998b4\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.937906 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-config-data\") pod \"1950feda-6261-4e6d-8edd-26caa31998b4\" (UID: \"1950feda-6261-4e6d-8edd-26caa31998b4\") " Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.938633 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1950feda-6261-4e6d-8edd-26caa31998b4-logs" (OuterVolumeSpecName: "logs") pod "1950feda-6261-4e6d-8edd-26caa31998b4" (UID: "1950feda-6261-4e6d-8edd-26caa31998b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.942970 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1950feda-6261-4e6d-8edd-26caa31998b4-kube-api-access-fp57p" (OuterVolumeSpecName: "kube-api-access-fp57p") pod "1950feda-6261-4e6d-8edd-26caa31998b4" (UID: "1950feda-6261-4e6d-8edd-26caa31998b4"). InnerVolumeSpecName "kube-api-access-fp57p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.966374 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-config-data" (OuterVolumeSpecName: "config-data") pod "1950feda-6261-4e6d-8edd-26caa31998b4" (UID: "1950feda-6261-4e6d-8edd-26caa31998b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:14 crc kubenswrapper[4782]: I0130 18:52:14.981407 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1950feda-6261-4e6d-8edd-26caa31998b4" (UID: "1950feda-6261-4e6d-8edd-26caa31998b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.020014 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1950feda-6261-4e6d-8edd-26caa31998b4" (UID: "1950feda-6261-4e6d-8edd-26caa31998b4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.040535 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1950feda-6261-4e6d-8edd-26caa31998b4-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.040737 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.040793 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp57p\" (UniqueName: \"kubernetes.io/projected/1950feda-6261-4e6d-8edd-26caa31998b4-kube-api-access-fp57p\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.040880 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.040938 4782 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1950feda-6261-4e6d-8edd-26caa31998b4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.145931 4782 generic.go:334] "Generic (PLEG): container finished" podID="1950feda-6261-4e6d-8edd-26caa31998b4" containerID="04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80" exitCode=0 Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.145982 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1950feda-6261-4e6d-8edd-26caa31998b4","Type":"ContainerDied","Data":"04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80"} Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.146007 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1950feda-6261-4e6d-8edd-26caa31998b4","Type":"ContainerDied","Data":"699ddb65b6e83015efa8e31dd86fe8806365451f69d2ce2ea2194023a6ad0214"} Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.146022 4782 scope.go:117] "RemoveContainer" containerID="04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.146095 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.183730 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.188996 4782 scope.go:117] "RemoveContainer" containerID="8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.196627 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.223506 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.224038 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-metadata" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224060 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-metadata" Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.224083 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-log" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224091 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-log" Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.224116 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="extract-content" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224124 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="extract-content" Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.224143 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="registry-server" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224150 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="registry-server" Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.224175 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18a53ee-3844-44b6-b8f2-149dd7b6f725" containerName="nova-manage" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224185 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18a53ee-3844-44b6-b8f2-149dd7b6f725" containerName="nova-manage" Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.224200 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="extract-utilities" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224209 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="extract-utilities" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224439 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18a53ee-3844-44b6-b8f2-149dd7b6f725" containerName="nova-manage" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224465 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-metadata" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224479 
4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" containerName="nova-metadata-log" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.224507 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="464f5bf8-cf27-4e1e-aec8-cd3decdfb0dd" containerName="registry-server" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.225842 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.231463 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.231562 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.234632 4782 scope.go:117] "RemoveContainer" containerID="04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80" Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.238101 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80\": container with ID starting with 04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80 not found: ID does not exist" containerID="04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.238131 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80"} err="failed to get container status \"04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80\": rpc error: code = NotFound desc = could not find container \"04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80\": container with ID starting with 04b16ee910672eba7b5a1ea4863d19a4a2c91ea4ccd43f2b5b440737c990fc80 not found: ID does not exist" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.238155 4782 scope.go:117] "RemoveContainer" containerID="8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.238346 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:52:15 crc kubenswrapper[4782]: E0130 18:52:15.240268 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94\": container with ID starting with 8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94 not found: ID does not exist" containerID="8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.240319 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94"} err="failed to get container status \"8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94\": rpc error: code = NotFound desc = could not find container \"8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94\": container with ID starting with 8f429b6f01930afaa3809b0fbf45b35d261a16f6c330489c31bae4c6d1951b94 not found: ID does not exist" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.364651 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd2ef42-aeac-48dd-9e95-fd000381dbfa-logs\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.364769 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.364802 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdnb4\" (UniqueName: \"kubernetes.io/projected/efd2ef42-aeac-48dd-9e95-fd000381dbfa-kube-api-access-jdnb4\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.364833 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.364883 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-config-data\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.466660 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.466707 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdnb4\" (UniqueName: \"kubernetes.io/projected/efd2ef42-aeac-48dd-9e95-fd000381dbfa-kube-api-access-jdnb4\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.466727 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.466764 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-config-data\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.466975 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/efd2ef42-aeac-48dd-9e95-fd000381dbfa-logs\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.467692 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd2ef42-aeac-48dd-9e95-fd000381dbfa-logs\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.472693 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.481744 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-config-data\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.482221 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd2ef42-aeac-48dd-9e95-fd000381dbfa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.485362 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdnb4\" (UniqueName: \"kubernetes.io/projected/efd2ef42-aeac-48dd-9e95-fd000381dbfa-kube-api-access-jdnb4\") pod \"nova-metadata-0\" (UID: \"efd2ef42-aeac-48dd-9e95-fd000381dbfa\") " pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.541357 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.689944 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.775036 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-config-data\") pod \"2d111af9-ea7a-4826-a405-c14931dfd7b3\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.775313 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngqxs\" (UniqueName: \"kubernetes.io/projected/2d111af9-ea7a-4826-a405-c14931dfd7b3-kube-api-access-ngqxs\") pod \"2d111af9-ea7a-4826-a405-c14931dfd7b3\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.775368 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-combined-ca-bundle\") pod \"2d111af9-ea7a-4826-a405-c14931dfd7b3\" (UID: \"2d111af9-ea7a-4826-a405-c14931dfd7b3\") " Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.779366 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d111af9-ea7a-4826-a405-c14931dfd7b3-kube-api-access-ngqxs" (OuterVolumeSpecName: "kube-api-access-ngqxs") pod "2d111af9-ea7a-4826-a405-c14931dfd7b3" (UID: "2d111af9-ea7a-4826-a405-c14931dfd7b3"). InnerVolumeSpecName "kube-api-access-ngqxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.806101 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-config-data" (OuterVolumeSpecName: "config-data") pod "2d111af9-ea7a-4826-a405-c14931dfd7b3" (UID: "2d111af9-ea7a-4826-a405-c14931dfd7b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.822956 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d111af9-ea7a-4826-a405-c14931dfd7b3" (UID: "2d111af9-ea7a-4826-a405-c14931dfd7b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.878069 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngqxs\" (UniqueName: \"kubernetes.io/projected/2d111af9-ea7a-4826-a405-c14931dfd7b3-kube-api-access-ngqxs\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.878112 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.878122 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d111af9-ea7a-4826-a405-c14931dfd7b3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:15 crc kubenswrapper[4782]: I0130 18:52:15.996892 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 18:52:16 crc kubenswrapper[4782]: W0130 18:52:16.002267 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefd2ef42_aeac_48dd_9e95_fd000381dbfa.slice/crio-ffe26c1596fe7d6c82190c4807b20e5022d75ac95051d7abaff0e78ea8c018ae WatchSource:0}: Error finding container ffe26c1596fe7d6c82190c4807b20e5022d75ac95051d7abaff0e78ea8c018ae: Status 404 returned error can't find the container with id ffe26c1596fe7d6c82190c4807b20e5022d75ac95051d7abaff0e78ea8c018ae Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.165967 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"efd2ef42-aeac-48dd-9e95-fd000381dbfa","Type":"ContainerStarted","Data":"8e4985dfe654d69fa3e97889663bb60c2625ce3005e125a2258aaed06438ca07"} Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.166007 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"efd2ef42-aeac-48dd-9e95-fd000381dbfa","Type":"ContainerStarted","Data":"ffe26c1596fe7d6c82190c4807b20e5022d75ac95051d7abaff0e78ea8c018ae"} Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.167728 4782 generic.go:334] "Generic (PLEG): container finished" podID="2d111af9-ea7a-4826-a405-c14931dfd7b3" containerID="843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821" exitCode=0 Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.167827 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.167852 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d111af9-ea7a-4826-a405-c14931dfd7b3","Type":"ContainerDied","Data":"843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821"} Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.167925 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d111af9-ea7a-4826-a405-c14931dfd7b3","Type":"ContainerDied","Data":"311123e9c69aff50930999dea4c489db12185db8fce1de11277bd0886913645c"} Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.167960 4782 scope.go:117] "RemoveContainer" containerID="843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.199823 4782 scope.go:117] "RemoveContainer" containerID="843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821" Jan 30 18:52:16 crc kubenswrapper[4782]: E0130 18:52:16.202873 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821\": container with ID starting with 843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821 not found: ID does not exist" containerID="843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.202954 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821"} err="failed to get container status \"843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821\": rpc error: code = NotFound desc = could not find container \"843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821\": container with ID starting with 843bac6307f8f3e5ff46d6068afcdcae68a76110506a12aae26e513a05355821 not found: ID does not exist" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.239915 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.261282 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.292431 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:52:16 crc kubenswrapper[4782]: E0130 18:52:16.293174 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d111af9-ea7a-4826-a405-c14931dfd7b3" containerName="nova-scheduler-scheduler" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.293198 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d111af9-ea7a-4826-a405-c14931dfd7b3" containerName="nova-scheduler-scheduler" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.293490 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d111af9-ea7a-4826-a405-c14931dfd7b3" containerName="nova-scheduler-scheduler" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.294525 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.297211 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.321319 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.387393 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1caadc78-c45b-4e64-ae44-a6f96bb41126-config-data\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.387604 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1caadc78-c45b-4e64-ae44-a6f96bb41126-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.387653 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6t4f\" (UniqueName: \"kubernetes.io/projected/1caadc78-c45b-4e64-ae44-a6f96bb41126-kube-api-access-k6t4f\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.432399 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1950feda-6261-4e6d-8edd-26caa31998b4" path="/var/lib/kubelet/pods/1950feda-6261-4e6d-8edd-26caa31998b4/volumes" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.433187 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d111af9-ea7a-4826-a405-c14931dfd7b3" path="/var/lib/kubelet/pods/2d111af9-ea7a-4826-a405-c14931dfd7b3/volumes" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.490111 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1caadc78-c45b-4e64-ae44-a6f96bb41126-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.490170 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6t4f\" (UniqueName: \"kubernetes.io/projected/1caadc78-c45b-4e64-ae44-a6f96bb41126-kube-api-access-k6t4f\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.490291 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1caadc78-c45b-4e64-ae44-a6f96bb41126-config-data\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.506078 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1caadc78-c45b-4e64-ae44-a6f96bb41126-config-data\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 
18:52:16.506886 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1caadc78-c45b-4e64-ae44-a6f96bb41126-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.511107 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6t4f\" (UniqueName: \"kubernetes.io/projected/1caadc78-c45b-4e64-ae44-a6f96bb41126-kube-api-access-k6t4f\") pod \"nova-scheduler-0\" (UID: \"1caadc78-c45b-4e64-ae44-a6f96bb41126\") " pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.690755 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.703640 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.799931 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-config-data\") pod \"e2610647-0a40-45d9-806c-6d1e737caf21\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.800186 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-combined-ca-bundle\") pod \"e2610647-0a40-45d9-806c-6d1e737caf21\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.800447 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2qdg\" (UniqueName: \"kubernetes.io/projected/e2610647-0a40-45d9-806c-6d1e737caf21-kube-api-access-s2qdg\") pod \"e2610647-0a40-45d9-806c-6d1e737caf21\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.801335 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-internal-tls-certs\") pod \"e2610647-0a40-45d9-806c-6d1e737caf21\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.801363 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-public-tls-certs\") pod \"e2610647-0a40-45d9-806c-6d1e737caf21\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.801398 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2610647-0a40-45d9-806c-6d1e737caf21-logs\") pod \"e2610647-0a40-45d9-806c-6d1e737caf21\" (UID: \"e2610647-0a40-45d9-806c-6d1e737caf21\") " Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.802556 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2610647-0a40-45d9-806c-6d1e737caf21-logs" (OuterVolumeSpecName: "logs") pod "e2610647-0a40-45d9-806c-6d1e737caf21" (UID: "e2610647-0a40-45d9-806c-6d1e737caf21"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.825185 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2610647-0a40-45d9-806c-6d1e737caf21-kube-api-access-s2qdg" (OuterVolumeSpecName: "kube-api-access-s2qdg") pod "e2610647-0a40-45d9-806c-6d1e737caf21" (UID: "e2610647-0a40-45d9-806c-6d1e737caf21"). InnerVolumeSpecName "kube-api-access-s2qdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.862348 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-config-data" (OuterVolumeSpecName: "config-data") pod "e2610647-0a40-45d9-806c-6d1e737caf21" (UID: "e2610647-0a40-45d9-806c-6d1e737caf21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.902438 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2610647-0a40-45d9-806c-6d1e737caf21" (UID: "e2610647-0a40-45d9-806c-6d1e737caf21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.903541 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2qdg\" (UniqueName: \"kubernetes.io/projected/e2610647-0a40-45d9-806c-6d1e737caf21-kube-api-access-s2qdg\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.903564 4782 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2610647-0a40-45d9-806c-6d1e737caf21-logs\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.903574 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.903585 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.919840 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e2610647-0a40-45d9-806c-6d1e737caf21" (UID: "e2610647-0a40-45d9-806c-6d1e737caf21"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:16 crc kubenswrapper[4782]: I0130 18:52:16.951386 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e2610647-0a40-45d9-806c-6d1e737caf21" (UID: "e2610647-0a40-45d9-806c-6d1e737caf21"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.014471 4782 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.014503 4782 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2610647-0a40-45d9-806c-6d1e737caf21-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.039898 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 18:52:17 crc kubenswrapper[4782]: W0130 18:52:17.040942 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1caadc78_c45b_4e64_ae44_a6f96bb41126.slice/crio-a0b686a04f4707dcea5116b3d15ce7a9e69a8c7e741b9f77f4ed8af3f2182c80 WatchSource:0}: Error finding container a0b686a04f4707dcea5116b3d15ce7a9e69a8c7e741b9f77f4ed8af3f2182c80: Status 404 returned error can't find the container with id a0b686a04f4707dcea5116b3d15ce7a9e69a8c7e741b9f77f4ed8af3f2182c80 Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.192277 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1caadc78-c45b-4e64-ae44-a6f96bb41126","Type":"ContainerStarted","Data":"a0b686a04f4707dcea5116b3d15ce7a9e69a8c7e741b9f77f4ed8af3f2182c80"} Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.195132 4782 generic.go:334] "Generic (PLEG): container finished" podID="e2610647-0a40-45d9-806c-6d1e737caf21" containerID="63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101" exitCode=0 Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.195179 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2610647-0a40-45d9-806c-6d1e737caf21","Type":"ContainerDied","Data":"63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101"} Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.195199 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2610647-0a40-45d9-806c-6d1e737caf21","Type":"ContainerDied","Data":"80ff495a02e375c74b0928e8cebc3494da7d503df3134ff243f4f9919850283e"} Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.195215 4782 scope.go:117] "RemoveContainer" containerID="63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.195378 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.197779 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"efd2ef42-aeac-48dd-9e95-fd000381dbfa","Type":"ContainerStarted","Data":"805c03019e30b58bdabf44a7cb87c1caf485173a9d43c02f69dc0dd49d05a0e0"} Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.222675 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.222658056 podStartE2EDuration="2.222658056s" podCreationTimestamp="2026-01-30 18:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:52:17.217104249 +0000 UTC m=+1313.485482274" watchObservedRunningTime="2026-01-30 18:52:17.222658056 +0000 UTC m=+1313.491036081" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.238647 4782 scope.go:117] "RemoveContainer" containerID="67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.257331 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.269178 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.271672 4782 scope.go:117] "RemoveContainer" containerID="63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101" Jan 30 18:52:17 crc kubenswrapper[4782]: E0130 18:52:17.272175 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101\": container with ID starting with 63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101 not found: ID does not exist" containerID="63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.272215 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101"} err="failed to get container status \"63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101\": rpc error: code = NotFound desc = could not find container \"63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101\": container with ID starting with 63a4add7dcfec6caf036d527c25539911cfc6fb39cd0e24fed9e385283c0d101 not found: ID does not exist" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.272271 4782 scope.go:117] "RemoveContainer" containerID="67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056" Jan 30 18:52:17 crc kubenswrapper[4782]: E0130 18:52:17.272527 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056\": container with ID starting with 67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056 not found: ID does not exist" containerID="67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.272573 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056"} err="failed to get container status 
\"67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056\": rpc error: code = NotFound desc = could not find container \"67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056\": container with ID starting with 67251b33425581c2e01c0e0b7e8457f087d163b3002fa684027e34496a569056 not found: ID does not exist" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.279130 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:17 crc kubenswrapper[4782]: E0130 18:52:17.279702 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-log" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.279719 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-log" Jan 30 18:52:17 crc kubenswrapper[4782]: E0130 18:52:17.279744 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-api" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.279757 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-api" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.280095 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-log" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.280133 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" containerName="nova-api-api" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.281510 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.284629 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.284748 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.284813 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.288054 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.423689 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnpl\" (UniqueName: \"kubernetes.io/projected/20743691-4aeb-4b01-a442-5df58c830c02-kube-api-access-cwnpl\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.423770 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-config-data\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.423841 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.423876 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.423940 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-public-tls-certs\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.423978 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20743691-4aeb-4b01-a442-5df58c830c02-logs\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.525174 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20743691-4aeb-4b01-a442-5df58c830c02-logs\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.525385 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnpl\" (UniqueName: \"kubernetes.io/projected/20743691-4aeb-4b01-a442-5df58c830c02-kube-api-access-cwnpl\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.525457 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-config-data\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.525496 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.525526 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.525585 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-public-tls-certs\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.527342 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20743691-4aeb-4b01-a442-5df58c830c02-logs\") pod \"nova-api-0\" (UID: 
\"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.532668 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.534143 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-public-tls-certs\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.534921 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.535439 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20743691-4aeb-4b01-a442-5df58c830c02-config-data\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.547680 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnpl\" (UniqueName: \"kubernetes.io/projected/20743691-4aeb-4b01-a442-5df58c830c02-kube-api-access-cwnpl\") pod \"nova-api-0\" (UID: \"20743691-4aeb-4b01-a442-5df58c830c02\") " pod="openstack/nova-api-0" Jan 30 18:52:17 crc kubenswrapper[4782]: I0130 18:52:17.619994 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 18:52:18 crc kubenswrapper[4782]: I0130 18:52:18.126720 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 18:52:18 crc kubenswrapper[4782]: W0130 18:52:18.141256 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20743691_4aeb_4b01_a442_5df58c830c02.slice/crio-f08c61f4136f6f6b704730f0d7c3decadec7ae2d88e85973445c4458abba3050 WatchSource:0}: Error finding container f08c61f4136f6f6b704730f0d7c3decadec7ae2d88e85973445c4458abba3050: Status 404 returned error can't find the container with id f08c61f4136f6f6b704730f0d7c3decadec7ae2d88e85973445c4458abba3050 Jan 30 18:52:18 crc kubenswrapper[4782]: I0130 18:52:18.214763 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20743691-4aeb-4b01-a442-5df58c830c02","Type":"ContainerStarted","Data":"f08c61f4136f6f6b704730f0d7c3decadec7ae2d88e85973445c4458abba3050"} Jan 30 18:52:18 crc kubenswrapper[4782]: I0130 18:52:18.218395 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1caadc78-c45b-4e64-ae44-a6f96bb41126","Type":"ContainerStarted","Data":"aa5cb9a9a99926b68c111599961114cbcaf6e557fc05d406a6ee78e36b11824b"} Jan 30 18:52:18 crc kubenswrapper[4782]: I0130 18:52:18.243338 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.243318574 podStartE2EDuration="2.243318574s" podCreationTimestamp="2026-01-30 18:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:52:18.233685006 +0000 UTC m=+1314.502063051" watchObservedRunningTime="2026-01-30 18:52:18.243318574 +0000 UTC m=+1314.511696599" Jan 30 18:52:18 crc kubenswrapper[4782]: I0130 18:52:18.457425 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2610647-0a40-45d9-806c-6d1e737caf21" path="/var/lib/kubelet/pods/e2610647-0a40-45d9-806c-6d1e737caf21/volumes" Jan 30 18:52:19 crc kubenswrapper[4782]: I0130 18:52:19.229538 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20743691-4aeb-4b01-a442-5df58c830c02","Type":"ContainerStarted","Data":"be5ad5fe7f88135ae5c581805b771dc21f1ad293c4dd7c0a6b47913d834eed32"} Jan 30 18:52:19 crc kubenswrapper[4782]: I0130 18:52:19.229837 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20743691-4aeb-4b01-a442-5df58c830c02","Type":"ContainerStarted","Data":"5c2e91894dd7e26d5514343971b398ba9e8cc3c8e80c6acabc1f841e04fb7208"} Jan 30 18:52:19 crc kubenswrapper[4782]: I0130 18:52:19.258826 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.258801796 podStartE2EDuration="2.258801796s" podCreationTimestamp="2026-01-30 18:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:52:19.254414397 +0000 UTC m=+1315.522792432" watchObservedRunningTime="2026-01-30 18:52:19.258801796 +0000 UTC m=+1315.527179861" Jan 30 18:52:20 crc kubenswrapper[4782]: I0130 18:52:20.541869 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 18:52:20 crc kubenswrapper[4782]: I0130 18:52:20.542702 4782 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 18:52:21 crc kubenswrapper[4782]: I0130 18:52:21.691951 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 18:52:25 crc kubenswrapper[4782]: I0130 18:52:25.542543 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 18:52:25 crc kubenswrapper[4782]: I0130 18:52:25.543797 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 18:52:26 crc kubenswrapper[4782]: I0130 18:52:26.556364 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="efd2ef42-aeac-48dd-9e95-fd000381dbfa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.228:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 18:52:26 crc kubenswrapper[4782]: I0130 18:52:26.556369 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="efd2ef42-aeac-48dd-9e95-fd000381dbfa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.228:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 18:52:26 crc kubenswrapper[4782]: I0130 18:52:26.692493 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 18:52:26 crc kubenswrapper[4782]: I0130 18:52:26.733077 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 18:52:27 crc kubenswrapper[4782]: I0130 18:52:27.359256 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 18:52:27 crc kubenswrapper[4782]: I0130 18:52:27.621317 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:52:27 crc kubenswrapper[4782]: I0130 18:52:27.621367 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 18:52:28 crc kubenswrapper[4782]: I0130 18:52:28.634438 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20743691-4aeb-4b01-a442-5df58c830c02" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 18:52:28 crc kubenswrapper[4782]: I0130 18:52:28.634463 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20743691-4aeb-4b01-a442-5df58c830c02" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 18:52:35 crc kubenswrapper[4782]: I0130 18:52:35.550339 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 18:52:35 crc kubenswrapper[4782]: I0130 18:52:35.551000 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 18:52:35 crc kubenswrapper[4782]: I0130 18:52:35.560208 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 18:52:35 crc kubenswrapper[4782]: I0130 18:52:35.571478 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Jan 30 18:52:36 crc kubenswrapper[4782]: I0130 18:52:36.429969 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 18:52:37 crc kubenswrapper[4782]: I0130 18:52:37.633145 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 18:52:37 crc kubenswrapper[4782]: I0130 18:52:37.633576 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 18:52:37 crc kubenswrapper[4782]: I0130 18:52:37.640368 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 18:52:37 crc kubenswrapper[4782]: I0130 18:52:37.661575 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 18:52:38 crc kubenswrapper[4782]: I0130 18:52:38.438924 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 18:52:38 crc kubenswrapper[4782]: I0130 18:52:38.453721 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 18:52:46 crc kubenswrapper[4782]: I0130 18:52:46.128487 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:52:47 crc kubenswrapper[4782]: I0130 18:52:47.261072 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:52:49 crc kubenswrapper[4782]: I0130 18:52:49.554860 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerName="rabbitmq" containerID="cri-o://598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f" gracePeriod=604797 Jan 30 18:52:50 crc kubenswrapper[4782]: I0130 18:52:50.644464 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerName="rabbitmq" containerID="cri-o://72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b" gracePeriod=604797 Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.161345 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.219660 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh762\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-kube-api-access-xh762\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.219788 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-config-data\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.219835 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-confd\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.219895 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-server-conf\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.219930 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-plugins\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.219978 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-plugins-conf\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.220070 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ef56512-bf17-45df-9e3d-ff2e97f66252-pod-info\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.220122 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-tls\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.220154 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.220189 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-erlang-cookie\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: 
\"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.220323 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ef56512-bf17-45df-9e3d-ff2e97f66252-erlang-cookie-secret\") pod \"3ef56512-bf17-45df-9e3d-ff2e97f66252\" (UID: \"3ef56512-bf17-45df-9e3d-ff2e97f66252\") " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.220917 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.221579 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.222614 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.227894 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef56512-bf17-45df-9e3d-ff2e97f66252-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.231641 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3ef56512-bf17-45df-9e3d-ff2e97f66252-pod-info" (OuterVolumeSpecName: "pod-info") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.238600 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.239153 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-kube-api-access-xh762" (OuterVolumeSpecName: "kube-api-access-xh762") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "kube-api-access-xh762". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.243859 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.322938 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh762\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-kube-api-access-xh762\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.322964 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.322973 4782 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.322982 4782 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3ef56512-bf17-45df-9e3d-ff2e97f66252-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.322990 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.323010 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.323019 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.323029 4782 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3ef56512-bf17-45df-9e3d-ff2e97f66252-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.323775 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-server-conf" (OuterVolumeSpecName: "server-conf") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.353699 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-config-data" (OuterVolumeSpecName: "config-data") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.362822 4782 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.426062 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.426093 4782 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3ef56512-bf17-45df-9e3d-ff2e97f66252-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.426102 4782 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.474215 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3ef56512-bf17-45df-9e3d-ff2e97f66252" (UID: "3ef56512-bf17-45df-9e3d-ff2e97f66252"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.527505 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3ef56512-bf17-45df-9e3d-ff2e97f66252-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.580003 4782 generic.go:334] "Generic (PLEG): container finished" podID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerID="598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f" exitCode=0 Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.580045 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ef56512-bf17-45df-9e3d-ff2e97f66252","Type":"ContainerDied","Data":"598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f"} Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.580077 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3ef56512-bf17-45df-9e3d-ff2e97f66252","Type":"ContainerDied","Data":"c52c50b15dfecad3a0fcfbfc767bf94be310f485b841e8b350aa05b73c636151"} Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.580078 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.580094 4782 scope.go:117] "RemoveContainer" containerID="598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.606969 4782 scope.go:117] "RemoveContainer" containerID="f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.652664 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.654709 4782 scope.go:117] "RemoveContainer" containerID="598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f" Jan 30 18:52:51 crc kubenswrapper[4782]: E0130 18:52:51.655345 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f\": container with ID starting with 598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f not found: ID does not exist" containerID="598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.655378 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f"} err="failed to get container status \"598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f\": rpc error: code = NotFound desc = could not find container \"598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f\": container with ID starting with 598b5a4ab725da4ca5a5ecb8e3ddea58cfd7e5d346830630ef82209a6494f82f not found: ID does not exist" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.655400 4782 scope.go:117] "RemoveContainer" containerID="f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18" Jan 30 18:52:51 crc kubenswrapper[4782]: E0130 18:52:51.656611 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18\": container with ID starting with f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18 not found: ID does not exist" containerID="f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.656661 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18"} err="failed to get container status \"f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18\": rpc error: code = NotFound desc = could not find container \"f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18\": container with ID starting with f07b5cf2203d393e24806596e4dc2c7ca81754e144adca25662c97db42dd8b18 not found: ID does not exist" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.670353 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.732129 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:52:51 crc kubenswrapper[4782]: E0130 18:52:51.732657 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerName="setup-container" 
Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.732683 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerName="setup-container" Jan 30 18:52:51 crc kubenswrapper[4782]: E0130 18:52:51.732720 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerName="rabbitmq" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.732731 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerName="rabbitmq" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.732956 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" containerName="rabbitmq" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.741360 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.743941 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.744146 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.744375 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.744511 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.744658 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.744991 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.745100 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mtrhn" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.752192 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.833774 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.833850 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.833958 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834017 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcjg\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-kube-api-access-wkcjg\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834071 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834104 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834179 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834264 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834355 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834405 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.834447 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.935954 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936016 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936042 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936065 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936109 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936142 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936167 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936187 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcjg\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-kube-api-access-wkcjg\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936210 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936242 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.936272 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.937354 4782 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.937544 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.937640 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.938328 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-config-data\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.938483 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.941049 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.943886 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.945611 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.949837 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.956657 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 
18:52:51 crc kubenswrapper[4782]: I0130 18:52:51.962680 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcjg\" (UniqueName: \"kubernetes.io/projected/f74ddec0-3f55-44e4-80f4-2d4eac7a9093-kube-api-access-wkcjg\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.011803 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"f74ddec0-3f55-44e4-80f4-2d4eac7a9093\") " pod="openstack/rabbitmq-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.060046 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.182073 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241022 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-confd\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241072 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-tls\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241093 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-plugins\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241167 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgf22\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-kube-api-access-wgf22\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241200 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30ded24a-ee08-4d96-80f1-3d5793ec76bb-pod-info\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241272 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-config-data\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241311 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-server-conf\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc 
kubenswrapper[4782]: I0130 18:52:52.241339 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30ded24a-ee08-4d96-80f1-3d5793ec76bb-erlang-cookie-secret\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241394 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241516 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-plugins-conf\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.241553 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-erlang-cookie\") pod \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\" (UID: \"30ded24a-ee08-4d96-80f1-3d5793ec76bb\") " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.243074 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.243999 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.244393 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.246311 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.246931 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/30ded24a-ee08-4d96-80f1-3d5793ec76bb-pod-info" (OuterVolumeSpecName: "pod-info") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.247179 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ded24a-ee08-4d96-80f1-3d5793ec76bb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.249356 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.250994 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-kube-api-access-wgf22" (OuterVolumeSpecName: "kube-api-access-wgf22") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "kube-api-access-wgf22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.293478 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-config-data" (OuterVolumeSpecName: "config-data") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344412 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgf22\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-kube-api-access-wgf22\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344442 4782 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30ded24a-ee08-4d96-80f1-3d5793ec76bb-pod-info\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344452 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344460 4782 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30ded24a-ee08-4d96-80f1-3d5793ec76bb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344480 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344489 4782 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344497 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344507 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.344516 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.409842 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-server-conf" (OuterVolumeSpecName: "server-conf") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.446579 4782 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.484183 4782 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30ded24a-ee08-4d96-80f1-3d5793ec76bb-server-conf\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.484257 4782 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.521865 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "30ded24a-ee08-4d96-80f1-3d5793ec76bb" (UID: "30ded24a-ee08-4d96-80f1-3d5793ec76bb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.522154 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef56512-bf17-45df-9e3d-ff2e97f66252" path="/var/lib/kubelet/pods/3ef56512-bf17-45df-9e3d-ff2e97f66252/volumes" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.603855 4782 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30ded24a-ee08-4d96-80f1-3d5793ec76bb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.617744 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-544h2"] Jan 30 18:52:52 crc kubenswrapper[4782]: E0130 18:52:52.618128 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerName="rabbitmq" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.618142 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerName="rabbitmq" Jan 30 18:52:52 crc kubenswrapper[4782]: E0130 18:52:52.618164 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerName="setup-container" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.618171 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerName="setup-container" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.618356 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerName="rabbitmq" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.619249 4782 generic.go:334] "Generic (PLEG): container finished" podID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" containerID="72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b" exitCode=0 Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.619349 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.619625 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30ded24a-ee08-4d96-80f1-3d5793ec76bb","Type":"ContainerDied","Data":"72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b"} Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.619655 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"30ded24a-ee08-4d96-80f1-3d5793ec76bb","Type":"ContainerDied","Data":"7821be7ff6e627cd88e32c30aa83967e8a1a219ccd02591580a69f0aa6d46749"} Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.619672 4782 scope.go:117] "RemoveContainer" containerID="72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.619791 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.655782 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-544h2"] Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.704479 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.705849 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84gg8\" (UniqueName: \"kubernetes.io/projected/2d30d28e-c223-4a6c-bae9-be80a2cca85a-kube-api-access-84gg8\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.705901 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-utilities\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.706729 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-catalog-content\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.715815 4782 scope.go:117] "RemoveContainer" containerID="4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.730360 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.745703 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.755407 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.757544 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.762178 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8grjl" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.762420 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.762522 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.762619 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.762825 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.762922 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.763021 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.763686 4782 scope.go:117] "RemoveContainer" containerID="72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.764987 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:52:52 crc kubenswrapper[4782]: E0130 18:52:52.765125 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b\": container with ID starting with 72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b not found: ID does not exist" containerID="72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.765152 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b"} err="failed to get container status \"72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b\": rpc error: code = NotFound desc = could not find container \"72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b\": container with ID starting with 72a6e5a3b6cd0ed0eb96d727e8b2edcb1e9575ce8247d166f2f4e4f16e161f5b not found: ID does not exist" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.765178 4782 scope.go:117] "RemoveContainer" containerID="4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521" Jan 30 18:52:52 crc kubenswrapper[4782]: E0130 18:52:52.766958 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521\": container with ID starting with 4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521 not found: ID does not exist" containerID="4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.766975 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521"} 
err="failed to get container status \"4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521\": rpc error: code = NotFound desc = could not find container \"4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521\": container with ID starting with 4933247d8258b2c220ef00b788ae117108f4c5cf86c2a861c375944d9fc32521 not found: ID does not exist" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.811863 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-catalog-content\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.811950 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84gg8\" (UniqueName: \"kubernetes.io/projected/2d30d28e-c223-4a6c-bae9-be80a2cca85a-kube-api-access-84gg8\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.811974 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-utilities\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.812436 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-catalog-content\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.814723 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-utilities\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.837514 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84gg8\" (UniqueName: \"kubernetes.io/projected/2d30d28e-c223-4a6c-bae9-be80a2cca85a-kube-api-access-84gg8\") pod \"community-operators-544h2\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913135 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913178 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913221 4782 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913253 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913291 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75a1a86d-bec9-47a8-9031-21a30029c09d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913308 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75a1a86d-bec9-47a8-9031-21a30029c09d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913350 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913387 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cmpt\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-kube-api-access-9cmpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913404 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913433 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc kubenswrapper[4782]: I0130 18:52:52.913449 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:52 crc 
kubenswrapper[4782]: I0130 18:52:52.976748 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-544h2" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017053 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017099 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017146 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017163 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017270 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75a1a86d-bec9-47a8-9031-21a30029c09d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017290 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75a1a86d-bec9-47a8-9031-21a30029c09d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017341 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017380 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cmpt\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-kube-api-access-9cmpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017401 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017438 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017459 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.017511 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.018031 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.018108 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.018184 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.018446 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.023162 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75a1a86d-bec9-47a8-9031-21a30029c09d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.032324 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.035610 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75a1a86d-bec9-47a8-9031-21a30029c09d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.038605 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75a1a86d-bec9-47a8-9031-21a30029c09d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.039346 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.040352 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cmpt\" (UniqueName: \"kubernetes.io/projected/75a1a86d-bec9-47a8-9031-21a30029c09d-kube-api-access-9cmpt\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.066385 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"75a1a86d-bec9-47a8-9031-21a30029c09d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.259088 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.621876 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-544h2"] Jan 30 18:52:53 crc kubenswrapper[4782]: W0130 18:52:53.623444 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d30d28e_c223_4a6c_bae9_be80a2cca85a.slice/crio-df49e7190b20d5974ad5006bcdd6d2f13d7e87ebf98d35e727cfbe0cb1bc36dd WatchSource:0}: Error finding container df49e7190b20d5974ad5006bcdd6d2f13d7e87ebf98d35e727cfbe0cb1bc36dd: Status 404 returned error can't find the container with id df49e7190b20d5974ad5006bcdd6d2f13d7e87ebf98d35e727cfbe0cb1bc36dd Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.631775 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74ddec0-3f55-44e4-80f4-2d4eac7a9093","Type":"ContainerStarted","Data":"51ea08df0c32020e4ab0f111279ec7aaaccf0052683f306946808136c224fbb4"} Jan 30 18:52:53 crc kubenswrapper[4782]: W0130 18:52:53.774663 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a1a86d_bec9_47a8_9031_21a30029c09d.slice/crio-3b8c507a2fcf69fd69dc9638d29cea514f9be88473b6ff4f58c63176742c683d WatchSource:0}: Error finding container 3b8c507a2fcf69fd69dc9638d29cea514f9be88473b6ff4f58c63176742c683d: Status 404 returned error can't find the container with id 3b8c507a2fcf69fd69dc9638d29cea514f9be88473b6ff4f58c63176742c683d Jan 30 18:52:53 crc kubenswrapper[4782]: I0130 18:52:53.802502 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 18:52:54 crc kubenswrapper[4782]: I0130 18:52:54.429898 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ded24a-ee08-4d96-80f1-3d5793ec76bb" path="/var/lib/kubelet/pods/30ded24a-ee08-4d96-80f1-3d5793ec76bb/volumes" Jan 30 18:52:54 crc kubenswrapper[4782]: I0130 18:52:54.641805 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75a1a86d-bec9-47a8-9031-21a30029c09d","Type":"ContainerStarted","Data":"3b8c507a2fcf69fd69dc9638d29cea514f9be88473b6ff4f58c63176742c683d"} Jan 30 18:52:54 crc kubenswrapper[4782]: I0130 18:52:54.643172 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74ddec0-3f55-44e4-80f4-2d4eac7a9093","Type":"ContainerStarted","Data":"4aad274c793bcc5f17bed7e6b6bdf7ce21dbf406b5f3ec0569dff06015c0c081"} Jan 30 18:52:54 crc kubenswrapper[4782]: I0130 18:52:54.646293 4782 generic.go:334] "Generic (PLEG): container finished" podID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerID="2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2" exitCode=0 Jan 30 18:52:54 crc kubenswrapper[4782]: I0130 18:52:54.646349 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-544h2" event={"ID":"2d30d28e-c223-4a6c-bae9-be80a2cca85a","Type":"ContainerDied","Data":"2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2"} Jan 30 18:52:54 crc kubenswrapper[4782]: I0130 18:52:54.646380 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-544h2" event={"ID":"2d30d28e-c223-4a6c-bae9-be80a2cca85a","Type":"ContainerStarted","Data":"df49e7190b20d5974ad5006bcdd6d2f13d7e87ebf98d35e727cfbe0cb1bc36dd"} Jan 30 
18:52:56 crc kubenswrapper[4782]: I0130 18:52:56.666015 4782 generic.go:334] "Generic (PLEG): container finished" podID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerID="e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e" exitCode=0 Jan 30 18:52:56 crc kubenswrapper[4782]: I0130 18:52:56.666065 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-544h2" event={"ID":"2d30d28e-c223-4a6c-bae9-be80a2cca85a","Type":"ContainerDied","Data":"e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e"} Jan 30 18:52:56 crc kubenswrapper[4782]: I0130 18:52:56.668152 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75a1a86d-bec9-47a8-9031-21a30029c09d","Type":"ContainerStarted","Data":"0d2ee4189ebab08be8901b6d359168fe0ab8b73ab9d446b6f7a1f90b5187e63a"} Jan 30 18:52:57 crc kubenswrapper[4782]: I0130 18:52:57.682300 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-544h2" event={"ID":"2d30d28e-c223-4a6c-bae9-be80a2cca85a","Type":"ContainerStarted","Data":"83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2"} Jan 30 18:52:57 crc kubenswrapper[4782]: I0130 18:52:57.706750 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-544h2" podStartSLOduration=3.2708834319999998 podStartE2EDuration="5.706733332s" podCreationTimestamp="2026-01-30 18:52:52 +0000 UTC" firstStartedPulling="2026-01-30 18:52:54.647666868 +0000 UTC m=+1350.916044893" lastFinishedPulling="2026-01-30 18:52:57.083516728 +0000 UTC m=+1353.351894793" observedRunningTime="2026-01-30 18:52:57.704029405 +0000 UTC m=+1353.972407460" watchObservedRunningTime="2026-01-30 18:52:57.706733332 +0000 UTC m=+1353.975111357" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.823142 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-777789c5ff-hg2f5"] Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.825032 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.827272 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.834197 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-777789c5ff-hg2f5"] Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.953141 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9cnz\" (UniqueName: \"kubernetes.io/projected/6c39e5c5-b835-4071-9daa-1b116dc37a02-kube-api-access-n9cnz\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.953323 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-swift-storage-0\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.953556 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-svc\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.953578 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-nb\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.953764 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-config\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.953830 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-openstack-edpm-ipam\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:01 crc kubenswrapper[4782]: I0130 18:53:01.953872 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-sb\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.055674 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-svc\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: 
\"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.055722 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-nb\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.055758 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-config\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.055784 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-openstack-edpm-ipam\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.055802 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-sb\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.055890 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9cnz\" (UniqueName: \"kubernetes.io/projected/6c39e5c5-b835-4071-9daa-1b116dc37a02-kube-api-access-n9cnz\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.055933 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-swift-storage-0\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.056820 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-swift-storage-0\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.057341 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-svc\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.057840 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-nb\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " 
pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.058769 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-sb\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.059214 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-config\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.059335 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-openstack-edpm-ipam\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.077198 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9cnz\" (UniqueName: \"kubernetes.io/projected/6c39e5c5-b835-4071-9daa-1b116dc37a02-kube-api-access-n9cnz\") pod \"dnsmasq-dns-777789c5ff-hg2f5\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.143537 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.729589 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-777789c5ff-hg2f5"] Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.977862 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-544h2" Jan 30 18:53:02 crc kubenswrapper[4782]: I0130 18:53:02.978211 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-544h2" Jan 30 18:53:03 crc kubenswrapper[4782]: I0130 18:53:03.025753 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-544h2" Jan 30 18:53:03 crc kubenswrapper[4782]: I0130 18:53:03.744294 4782 generic.go:334] "Generic (PLEG): container finished" podID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerID="7cf6db13630a93533027ca71018730402aa9307a89805504a76ed0bc7d8b3b93" exitCode=0 Jan 30 18:53:03 crc kubenswrapper[4782]: I0130 18:53:03.744428 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" event={"ID":"6c39e5c5-b835-4071-9daa-1b116dc37a02","Type":"ContainerDied","Data":"7cf6db13630a93533027ca71018730402aa9307a89805504a76ed0bc7d8b3b93"} Jan 30 18:53:03 crc kubenswrapper[4782]: I0130 18:53:03.744698 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" event={"ID":"6c39e5c5-b835-4071-9daa-1b116dc37a02","Type":"ContainerStarted","Data":"e884195bb716ae468af07892aa8a901cd229c7ce18b71ea3eb22b60960d00d84"} Jan 30 18:53:03 crc kubenswrapper[4782]: I0130 18:53:03.885412 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-544h2" Jan 30 18:53:03 crc kubenswrapper[4782]: I0130 18:53:03.978194 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-544h2"] Jan 30 18:53:04 crc kubenswrapper[4782]: I0130 18:53:04.759488 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" event={"ID":"6c39e5c5-b835-4071-9daa-1b116dc37a02","Type":"ContainerStarted","Data":"3741183bd6fd9b04a448be57660fde5c960db5e7458e6953067f8c2e0205314f"} Jan 30 18:53:04 crc kubenswrapper[4782]: I0130 18:53:04.789888 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" podStartSLOduration=3.789868734 podStartE2EDuration="3.789868734s" podCreationTimestamp="2026-01-30 18:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:53:04.789844074 +0000 UTC m=+1361.058222089" watchObservedRunningTime="2026-01-30 18:53:04.789868734 +0000 UTC m=+1361.058246759" Jan 30 18:53:05 crc kubenswrapper[4782]: I0130 18:53:05.770398 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-544h2" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="registry-server" containerID="cri-o://83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2" gracePeriod=2 Jan 30 18:53:05 crc kubenswrapper[4782]: I0130 18:53:05.770445 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.354387 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-544h2" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.461932 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-utilities\") pod \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.462132 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84gg8\" (UniqueName: \"kubernetes.io/projected/2d30d28e-c223-4a6c-bae9-be80a2cca85a-kube-api-access-84gg8\") pod \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.462269 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-catalog-content\") pod \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\" (UID: \"2d30d28e-c223-4a6c-bae9-be80a2cca85a\") " Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.463027 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-utilities" (OuterVolumeSpecName: "utilities") pod "2d30d28e-c223-4a6c-bae9-be80a2cca85a" (UID: "2d30d28e-c223-4a6c-bae9-be80a2cca85a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.467605 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d30d28e-c223-4a6c-bae9-be80a2cca85a-kube-api-access-84gg8" (OuterVolumeSpecName: "kube-api-access-84gg8") pod "2d30d28e-c223-4a6c-bae9-be80a2cca85a" (UID: "2d30d28e-c223-4a6c-bae9-be80a2cca85a"). InnerVolumeSpecName "kube-api-access-84gg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.564779 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84gg8\" (UniqueName: \"kubernetes.io/projected/2d30d28e-c223-4a6c-bae9-be80a2cca85a-kube-api-access-84gg8\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.564813 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.784449 4782 generic.go:334] "Generic (PLEG): container finished" podID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerID="83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2" exitCode=0 Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.784507 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-544h2" event={"ID":"2d30d28e-c223-4a6c-bae9-be80a2cca85a","Type":"ContainerDied","Data":"83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2"} Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.784574 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-544h2" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.784604 4782 scope.go:117] "RemoveContainer" containerID="83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.784584 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-544h2" event={"ID":"2d30d28e-c223-4a6c-bae9-be80a2cca85a","Type":"ContainerDied","Data":"df49e7190b20d5974ad5006bcdd6d2f13d7e87ebf98d35e727cfbe0cb1bc36dd"} Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.812316 4782 scope.go:117] "RemoveContainer" containerID="e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.849152 4782 scope.go:117] "RemoveContainer" containerID="2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.908812 4782 scope.go:117] "RemoveContainer" containerID="83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2" Jan 30 18:53:06 crc kubenswrapper[4782]: E0130 18:53:06.909685 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2\": container with ID starting with 83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2 not found: ID does not exist" containerID="83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.909740 4782 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2"} err="failed to get container status \"83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2\": rpc error: code = NotFound desc = could not find container \"83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2\": container with ID starting with 83ae92bb0bb4e4cda117a69473cbaf32a2d6a4ed6ad3199e37516eb721beacd2 not found: ID does not exist" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.909771 4782 scope.go:117] "RemoveContainer" containerID="e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e" Jan 30 18:53:06 crc kubenswrapper[4782]: E0130 18:53:06.910124 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e\": container with ID starting with e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e not found: ID does not exist" containerID="e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.910149 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e"} err="failed to get container status \"e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e\": rpc error: code = NotFound desc = could not find container \"e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e\": container with ID starting with e8b0b35b2613181bbdfe99a1d3d4631f41d7c8dac0a4eb2b2c510a515494c54e not found: ID does not exist" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.910165 4782 scope.go:117] "RemoveContainer" containerID="2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2" Jan 30 18:53:06 crc kubenswrapper[4782]: E0130 18:53:06.910458 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2\": container with ID starting with 2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2 not found: ID does not exist" containerID="2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2" Jan 30 18:53:06 crc kubenswrapper[4782]: I0130 18:53:06.910485 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2"} err="failed to get container status \"2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2\": rpc error: code = NotFound desc = could not find container \"2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2\": container with ID starting with 2ae31921122dd7244948a21de4fe3042c4538ee26cbe5e2e36c457a1201c4ee2 not found: ID does not exist" Jan 30 18:53:07 crc kubenswrapper[4782]: I0130 18:53:07.293150 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d30d28e-c223-4a6c-bae9-be80a2cca85a" (UID: "2d30d28e-c223-4a6c-bae9-be80a2cca85a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:53:07 crc kubenswrapper[4782]: I0130 18:53:07.384780 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d30d28e-c223-4a6c-bae9-be80a2cca85a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:07 crc kubenswrapper[4782]: I0130 18:53:07.441399 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-544h2"] Jan 30 18:53:07 crc kubenswrapper[4782]: I0130 18:53:07.456401 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-544h2"] Jan 30 18:53:08 crc kubenswrapper[4782]: I0130 18:53:08.429784 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" path="/var/lib/kubelet/pods/2d30d28e-c223-4a6c-bae9-be80a2cca85a/volumes" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.145455 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.200810 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8fb4d68c5-ftns8"] Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.201030 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" podUID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerName="dnsmasq-dns" containerID="cri-o://5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1" gracePeriod=10 Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.361117 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dcf879fb5-dx4z8"] Jan 30 18:53:12 crc kubenswrapper[4782]: E0130 18:53:12.361520 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="extract-utilities" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.361537 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="extract-utilities" Jan 30 18:53:12 crc kubenswrapper[4782]: E0130 18:53:12.361562 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="registry-server" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.361569 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="registry-server" Jan 30 18:53:12 crc kubenswrapper[4782]: E0130 18:53:12.361583 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="extract-content" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.361589 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="extract-content" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.361788 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d30d28e-c223-4a6c-bae9-be80a2cca85a" containerName="registry-server" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.362791 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.392817 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcf879fb5-dx4z8"] Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.507631 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxvpk\" (UniqueName: \"kubernetes.io/projected/a82aaec0-46a1-4f29-9c09-d4920bd1b315-kube-api-access-jxvpk\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.508158 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-dns-swift-storage-0\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.508195 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-dns-svc\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.508267 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.508314 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-config\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.508383 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-openstack-edpm-ipam\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.508459 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.610661 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-dns-swift-storage-0\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 
18:53:12.610722 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-dns-svc\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.610760 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.610793 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-config\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.610832 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-openstack-edpm-ipam\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.610907 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.610982 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxvpk\" (UniqueName: \"kubernetes.io/projected/a82aaec0-46a1-4f29-9c09-d4920bd1b315-kube-api-access-jxvpk\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.612443 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-ovsdbserver-nb\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.613509 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-ovsdbserver-sb\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.613571 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-openstack-edpm-ipam\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.613695 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-dns-svc\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.614387 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-dns-swift-storage-0\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.614850 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a82aaec0-46a1-4f29-9c09-d4920bd1b315-config\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.631376 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxvpk\" (UniqueName: \"kubernetes.io/projected/a82aaec0-46a1-4f29-9c09-d4920bd1b315-kube-api-access-jxvpk\") pod \"dnsmasq-dns-6dcf879fb5-dx4z8\" (UID: \"a82aaec0-46a1-4f29-9c09-d4920bd1b315\") " pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.693533 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.791660 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.848796 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-config\") pod \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.849082 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-nb\") pod \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.849133 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm568\" (UniqueName: \"kubernetes.io/projected/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-kube-api-access-zm568\") pod \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.849205 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-svc\") pod \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.849284 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-swift-storage-0\") pod \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " Jan 30 
18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.849370 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-sb\") pod \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\" (UID: \"fcd01ba4-1545-47eb-8aed-7b7c23b939b5\") " Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.866834 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-kube-api-access-zm568" (OuterVolumeSpecName: "kube-api-access-zm568") pod "fcd01ba4-1545-47eb-8aed-7b7c23b939b5" (UID: "fcd01ba4-1545-47eb-8aed-7b7c23b939b5"). InnerVolumeSpecName "kube-api-access-zm568". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.893344 4782 generic.go:334] "Generic (PLEG): container finished" podID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerID="5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1" exitCode=0 Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.893500 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" event={"ID":"fcd01ba4-1545-47eb-8aed-7b7c23b939b5","Type":"ContainerDied","Data":"5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1"} Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.893660 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" event={"ID":"fcd01ba4-1545-47eb-8aed-7b7c23b939b5","Type":"ContainerDied","Data":"8a40ca7096e01681b6cf3892811423049f51936fb2c4e4d3a7e8dbd9083a0ea1"} Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.893865 4782 scope.go:117] "RemoveContainer" containerID="5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.894010 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8fb4d68c5-ftns8" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.923122 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-config" (OuterVolumeSpecName: "config") pod "fcd01ba4-1545-47eb-8aed-7b7c23b939b5" (UID: "fcd01ba4-1545-47eb-8aed-7b7c23b939b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.928061 4782 scope.go:117] "RemoveContainer" containerID="fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.928286 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fcd01ba4-1545-47eb-8aed-7b7c23b939b5" (UID: "fcd01ba4-1545-47eb-8aed-7b7c23b939b5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.932033 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fcd01ba4-1545-47eb-8aed-7b7c23b939b5" (UID: "fcd01ba4-1545-47eb-8aed-7b7c23b939b5"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.941921 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fcd01ba4-1545-47eb-8aed-7b7c23b939b5" (UID: "fcd01ba4-1545-47eb-8aed-7b7c23b939b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.952293 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.952321 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.952332 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm568\" (UniqueName: \"kubernetes.io/projected/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-kube-api-access-zm568\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.952341 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.952349 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.961867 4782 scope.go:117] "RemoveContainer" containerID="5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1" Jan 30 18:53:12 crc kubenswrapper[4782]: E0130 18:53:12.963284 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1\": container with ID starting with 5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1 not found: ID does not exist" containerID="5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.963311 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1"} err="failed to get container status \"5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1\": rpc error: code = NotFound desc = could not find container \"5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1\": container with ID starting with 5015984c6333799f3c45373fc8df69ec879cd32cd4e691c319a22ce1616b89e1 not found: ID does not exist" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.963331 4782 scope.go:117] "RemoveContainer" containerID="fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf" Jan 30 18:53:12 crc kubenswrapper[4782]: E0130 18:53:12.963541 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf\": container with ID starting with 
fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf not found: ID does not exist" containerID="fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.963557 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf"} err="failed to get container status \"fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf\": rpc error: code = NotFound desc = could not find container \"fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf\": container with ID starting with fce3d8fde02fd79382c50d539cf04027f9aa249f5eb6c4180569118840876bdf not found: ID does not exist" Jan 30 18:53:12 crc kubenswrapper[4782]: I0130 18:53:12.992012 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fcd01ba4-1545-47eb-8aed-7b7c23b939b5" (UID: "fcd01ba4-1545-47eb-8aed-7b7c23b939b5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:13 crc kubenswrapper[4782]: I0130 18:53:13.054162 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcd01ba4-1545-47eb-8aed-7b7c23b939b5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:13 crc kubenswrapper[4782]: W0130 18:53:13.216796 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda82aaec0_46a1_4f29_9c09_d4920bd1b315.slice/crio-05b0561c25970f08b3880ac5e018ea002f4f0c817bb8414efd9889a4f1326d09 WatchSource:0}: Error finding container 05b0561c25970f08b3880ac5e018ea002f4f0c817bb8414efd9889a4f1326d09: Status 404 returned error can't find the container with id 05b0561c25970f08b3880ac5e018ea002f4f0c817bb8414efd9889a4f1326d09 Jan 30 18:53:13 crc kubenswrapper[4782]: I0130 18:53:13.217465 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dcf879fb5-dx4z8"] Jan 30 18:53:13 crc kubenswrapper[4782]: I0130 18:53:13.232131 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8fb4d68c5-ftns8"] Jan 30 18:53:13 crc kubenswrapper[4782]: I0130 18:53:13.245219 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8fb4d68c5-ftns8"] Jan 30 18:53:13 crc kubenswrapper[4782]: I0130 18:53:13.908254 4782 generic.go:334] "Generic (PLEG): container finished" podID="a82aaec0-46a1-4f29-9c09-d4920bd1b315" containerID="c3ba6d89e65f1a30fb363d1065bd73a708a1f87bb59181a6f7d52f38e974b7cc" exitCode=0 Jan 30 18:53:13 crc kubenswrapper[4782]: I0130 18:53:13.908379 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" event={"ID":"a82aaec0-46a1-4f29-9c09-d4920bd1b315","Type":"ContainerDied","Data":"c3ba6d89e65f1a30fb363d1065bd73a708a1f87bb59181a6f7d52f38e974b7cc"} Jan 30 18:53:13 crc kubenswrapper[4782]: I0130 18:53:13.908537 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" event={"ID":"a82aaec0-46a1-4f29-9c09-d4920bd1b315","Type":"ContainerStarted","Data":"05b0561c25970f08b3880ac5e018ea002f4f0c817bb8414efd9889a4f1326d09"} Jan 30 18:53:14 crc kubenswrapper[4782]: I0130 18:53:14.423777 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" path="/var/lib/kubelet/pods/fcd01ba4-1545-47eb-8aed-7b7c23b939b5/volumes" Jan 30 18:53:14 crc kubenswrapper[4782]: I0130 18:53:14.920006 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" event={"ID":"a82aaec0-46a1-4f29-9c09-d4920bd1b315","Type":"ContainerStarted","Data":"90764fab4bcea6e2c201b07cd689ddfc64d89eac014ec19e402ba21f43eaf801"} Jan 30 18:53:14 crc kubenswrapper[4782]: I0130 18:53:14.920261 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:14 crc kubenswrapper[4782]: I0130 18:53:14.958262 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" podStartSLOduration=2.958240275 podStartE2EDuration="2.958240275s" podCreationTimestamp="2026-01-30 18:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:53:14.950403792 +0000 UTC m=+1371.218781817" watchObservedRunningTime="2026-01-30 18:53:14.958240275 +0000 UTC m=+1371.226618300" Jan 30 18:53:22 crc kubenswrapper[4782]: I0130 18:53:22.695468 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dcf879fb5-dx4z8" Jan 30 18:53:22 crc kubenswrapper[4782]: I0130 18:53:22.779276 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-777789c5ff-hg2f5"] Jan 30 18:53:22 crc kubenswrapper[4782]: I0130 18:53:22.779840 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" podUID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerName="dnsmasq-dns" containerID="cri-o://3741183bd6fd9b04a448be57660fde5c960db5e7458e6953067f8c2e0205314f" gracePeriod=10 Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.019033 4782 generic.go:334] "Generic (PLEG): container finished" podID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerID="3741183bd6fd9b04a448be57660fde5c960db5e7458e6953067f8c2e0205314f" exitCode=0 Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.019074 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" event={"ID":"6c39e5c5-b835-4071-9daa-1b116dc37a02","Type":"ContainerDied","Data":"3741183bd6fd9b04a448be57660fde5c960db5e7458e6953067f8c2e0205314f"} Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.389363 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.403146 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-nb\") pod \"6c39e5c5-b835-4071-9daa-1b116dc37a02\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.403256 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-config\") pod \"6c39e5c5-b835-4071-9daa-1b116dc37a02\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.403328 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-sb\") pod \"6c39e5c5-b835-4071-9daa-1b116dc37a02\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.403355 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-openstack-edpm-ipam\") pod \"6c39e5c5-b835-4071-9daa-1b116dc37a02\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.403465 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-svc\") pod \"6c39e5c5-b835-4071-9daa-1b116dc37a02\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.403526 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9cnz\" (UniqueName: \"kubernetes.io/projected/6c39e5c5-b835-4071-9daa-1b116dc37a02-kube-api-access-n9cnz\") pod \"6c39e5c5-b835-4071-9daa-1b116dc37a02\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.403544 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-swift-storage-0\") pod \"6c39e5c5-b835-4071-9daa-1b116dc37a02\" (UID: \"6c39e5c5-b835-4071-9daa-1b116dc37a02\") " Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.440156 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c39e5c5-b835-4071-9daa-1b116dc37a02-kube-api-access-n9cnz" (OuterVolumeSpecName: "kube-api-access-n9cnz") pod "6c39e5c5-b835-4071-9daa-1b116dc37a02" (UID: "6c39e5c5-b835-4071-9daa-1b116dc37a02"). InnerVolumeSpecName "kube-api-access-n9cnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.493867 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-config" (OuterVolumeSpecName: "config") pod "6c39e5c5-b835-4071-9daa-1b116dc37a02" (UID: "6c39e5c5-b835-4071-9daa-1b116dc37a02"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.496944 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c39e5c5-b835-4071-9daa-1b116dc37a02" (UID: "6c39e5c5-b835-4071-9daa-1b116dc37a02"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.503074 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c39e5c5-b835-4071-9daa-1b116dc37a02" (UID: "6c39e5c5-b835-4071-9daa-1b116dc37a02"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.505655 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-config\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.505761 4782 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.505823 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9cnz\" (UniqueName: \"kubernetes.io/projected/6c39e5c5-b835-4071-9daa-1b116dc37a02-kube-api-access-n9cnz\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.505884 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.506297 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "6c39e5c5-b835-4071-9daa-1b116dc37a02" (UID: "6c39e5c5-b835-4071-9daa-1b116dc37a02"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.522122 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c39e5c5-b835-4071-9daa-1b116dc37a02" (UID: "6c39e5c5-b835-4071-9daa-1b116dc37a02"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.539728 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c39e5c5-b835-4071-9daa-1b116dc37a02" (UID: "6c39e5c5-b835-4071-9daa-1b116dc37a02"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.606942 4782 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.606980 4782 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:23 crc kubenswrapper[4782]: I0130 18:53:23.606990 4782 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6c39e5c5-b835-4071-9daa-1b116dc37a02-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 18:53:24 crc kubenswrapper[4782]: I0130 18:53:24.030698 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" event={"ID":"6c39e5c5-b835-4071-9daa-1b116dc37a02","Type":"ContainerDied","Data":"e884195bb716ae468af07892aa8a901cd229c7ce18b71ea3eb22b60960d00d84"} Jan 30 18:53:24 crc kubenswrapper[4782]: I0130 18:53:24.030757 4782 scope.go:117] "RemoveContainer" containerID="3741183bd6fd9b04a448be57660fde5c960db5e7458e6953067f8c2e0205314f" Jan 30 18:53:24 crc kubenswrapper[4782]: I0130 18:53:24.031803 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-777789c5ff-hg2f5" Jan 30 18:53:24 crc kubenswrapper[4782]: I0130 18:53:24.062199 4782 scope.go:117] "RemoveContainer" containerID="7cf6db13630a93533027ca71018730402aa9307a89805504a76ed0bc7d8b3b93" Jan 30 18:53:24 crc kubenswrapper[4782]: I0130 18:53:24.064944 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-777789c5ff-hg2f5"] Jan 30 18:53:24 crc kubenswrapper[4782]: I0130 18:53:24.088438 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-777789c5ff-hg2f5"] Jan 30 18:53:24 crc kubenswrapper[4782]: I0130 18:53:24.424768 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c39e5c5-b835-4071-9daa-1b116dc37a02" path="/var/lib/kubelet/pods/6c39e5c5-b835-4071-9daa-1b116dc37a02/volumes" Jan 30 18:53:28 crc kubenswrapper[4782]: I0130 18:53:28.079466 4782 generic.go:334] "Generic (PLEG): container finished" podID="f74ddec0-3f55-44e4-80f4-2d4eac7a9093" containerID="4aad274c793bcc5f17bed7e6b6bdf7ce21dbf406b5f3ec0569dff06015c0c081" exitCode=0 Jan 30 18:53:28 crc kubenswrapper[4782]: I0130 18:53:28.080009 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74ddec0-3f55-44e4-80f4-2d4eac7a9093","Type":"ContainerDied","Data":"4aad274c793bcc5f17bed7e6b6bdf7ce21dbf406b5f3ec0569dff06015c0c081"} Jan 30 18:53:29 crc kubenswrapper[4782]: I0130 18:53:29.092952 4782 generic.go:334] "Generic (PLEG): container finished" podID="75a1a86d-bec9-47a8-9031-21a30029c09d" containerID="0d2ee4189ebab08be8901b6d359168fe0ab8b73ab9d446b6f7a1f90b5187e63a" exitCode=0 Jan 30 18:53:29 crc kubenswrapper[4782]: I0130 18:53:29.093124 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75a1a86d-bec9-47a8-9031-21a30029c09d","Type":"ContainerDied","Data":"0d2ee4189ebab08be8901b6d359168fe0ab8b73ab9d446b6f7a1f90b5187e63a"} Jan 30 18:53:29 crc kubenswrapper[4782]: I0130 18:53:29.098286 4782 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-server-0" event={"ID":"f74ddec0-3f55-44e4-80f4-2d4eac7a9093","Type":"ContainerStarted","Data":"756f8ba57a01b894f6133fb24151ace8739e2643402742af5029f39db7491918"} Jan 30 18:53:29 crc kubenswrapper[4782]: I0130 18:53:29.098671 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 18:53:29 crc kubenswrapper[4782]: I0130 18:53:29.207220 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.20719388 podStartE2EDuration="38.20719388s" podCreationTimestamp="2026-01-30 18:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:53:29.196936417 +0000 UTC m=+1385.465314452" watchObservedRunningTime="2026-01-30 18:53:29.20719388 +0000 UTC m=+1385.475571915" Jan 30 18:53:30 crc kubenswrapper[4782]: I0130 18:53:30.111410 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75a1a86d-bec9-47a8-9031-21a30029c09d","Type":"ContainerStarted","Data":"fe3ed429286271aead20671ceae6abca1fd2a730b7b61545c8a8ea5e6705954f"} Jan 30 18:53:30 crc kubenswrapper[4782]: I0130 18:53:30.111974 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:53:30 crc kubenswrapper[4782]: I0130 18:53:30.133144 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.133121869 podStartE2EDuration="38.133121869s" podCreationTimestamp="2026-01-30 18:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 18:53:30.131618771 +0000 UTC m=+1386.399996796" watchObservedRunningTime="2026-01-30 18:53:30.133121869 +0000 UTC m=+1386.401499904" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.261209 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79"] Jan 30 18:53:41 crc kubenswrapper[4782]: E0130 18:53:41.263665 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerName="dnsmasq-dns" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.263813 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerName="dnsmasq-dns" Jan 30 18:53:41 crc kubenswrapper[4782]: E0130 18:53:41.265688 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerName="init" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.265820 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerName="init" Jan 30 18:53:41 crc kubenswrapper[4782]: E0130 18:53:41.265964 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerName="init" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.266156 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerName="init" Jan 30 18:53:41 crc kubenswrapper[4782]: E0130 18:53:41.266350 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerName="dnsmasq-dns" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.266603 
4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerName="dnsmasq-dns" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.267565 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd01ba4-1545-47eb-8aed-7b7c23b939b5" containerName="dnsmasq-dns" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.267735 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c39e5c5-b835-4071-9daa-1b116dc37a02" containerName="dnsmasq-dns" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.268968 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.272088 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.272652 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.273323 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.273887 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.287633 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79"] Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.407743 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.407836 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.407988 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.408211 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh9kn\" (UniqueName: \"kubernetes.io/projected/77e26ddb-4b47-4b06-a390-76653b75c503-kube-api-access-kh9kn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: 
I0130 18:53:41.510129 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.510302 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh9kn\" (UniqueName: \"kubernetes.io/projected/77e26ddb-4b47-4b06-a390-76653b75c503-kube-api-access-kh9kn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.510414 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.510580 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.516899 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.517270 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.517567 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc kubenswrapper[4782]: I0130 18:53:41.528289 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh9kn\" (UniqueName: \"kubernetes.io/projected/77e26ddb-4b47-4b06-a390-76653b75c503-kube-api-access-kh9kn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:41 crc 
kubenswrapper[4782]: I0130 18:53:41.593243 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:53:42 crc kubenswrapper[4782]: I0130 18:53:42.063626 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 18:53:42 crc kubenswrapper[4782]: I0130 18:53:42.279391 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79"] Jan 30 18:53:42 crc kubenswrapper[4782]: W0130 18:53:42.286357 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77e26ddb_4b47_4b06_a390_76653b75c503.slice/crio-fe32c7eeb48aab9612b6d6204f127efbd3d55a1077ef8d504921af27bc96b208 WatchSource:0}: Error finding container fe32c7eeb48aab9612b6d6204f127efbd3d55a1077ef8d504921af27bc96b208: Status 404 returned error can't find the container with id fe32c7eeb48aab9612b6d6204f127efbd3d55a1077ef8d504921af27bc96b208 Jan 30 18:53:43 crc kubenswrapper[4782]: I0130 18:53:43.246124 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" event={"ID":"77e26ddb-4b47-4b06-a390-76653b75c503","Type":"ContainerStarted","Data":"fe32c7eeb48aab9612b6d6204f127efbd3d55a1077ef8d504921af27bc96b208"} Jan 30 18:53:43 crc kubenswrapper[4782]: I0130 18:53:43.262376 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 18:53:49 crc kubenswrapper[4782]: I0130 18:53:49.793463 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:53:49 crc kubenswrapper[4782]: I0130 18:53:49.799700 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:53:53 crc kubenswrapper[4782]: I0130 18:53:53.354060 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" event={"ID":"77e26ddb-4b47-4b06-a390-76653b75c503","Type":"ContainerStarted","Data":"832d23158188a2aff33fd1b58a446f50b67fda4f27adc3571887e30a386ae9bc"} Jan 30 18:53:53 crc kubenswrapper[4782]: I0130 18:53:53.377659 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" podStartSLOduration=1.7427771189999999 podStartE2EDuration="12.377633732s" podCreationTimestamp="2026-01-30 18:53:41 +0000 UTC" firstStartedPulling="2026-01-30 18:53:42.288998192 +0000 UTC m=+1398.557376217" lastFinishedPulling="2026-01-30 18:53:52.923854775 +0000 UTC m=+1409.192232830" observedRunningTime="2026-01-30 18:53:53.370390673 +0000 UTC m=+1409.638768728" watchObservedRunningTime="2026-01-30 18:53:53.377633732 +0000 UTC m=+1409.646011797" Jan 30 18:54:04 crc kubenswrapper[4782]: I0130 18:54:04.477429 4782 generic.go:334] "Generic (PLEG): container finished" podID="77e26ddb-4b47-4b06-a390-76653b75c503" 
containerID="832d23158188a2aff33fd1b58a446f50b67fda4f27adc3571887e30a386ae9bc" exitCode=0 Jan 30 18:54:04 crc kubenswrapper[4782]: I0130 18:54:04.477542 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" event={"ID":"77e26ddb-4b47-4b06-a390-76653b75c503","Type":"ContainerDied","Data":"832d23158188a2aff33fd1b58a446f50b67fda4f27adc3571887e30a386ae9bc"} Jan 30 18:54:05 crc kubenswrapper[4782]: I0130 18:54:05.956091 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.143885 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-inventory\") pod \"77e26ddb-4b47-4b06-a390-76653b75c503\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.144338 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-ssh-key-openstack-edpm-ipam\") pod \"77e26ddb-4b47-4b06-a390-76653b75c503\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.144489 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh9kn\" (UniqueName: \"kubernetes.io/projected/77e26ddb-4b47-4b06-a390-76653b75c503-kube-api-access-kh9kn\") pod \"77e26ddb-4b47-4b06-a390-76653b75c503\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.144688 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-repo-setup-combined-ca-bundle\") pod \"77e26ddb-4b47-4b06-a390-76653b75c503\" (UID: \"77e26ddb-4b47-4b06-a390-76653b75c503\") " Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.152315 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e26ddb-4b47-4b06-a390-76653b75c503-kube-api-access-kh9kn" (OuterVolumeSpecName: "kube-api-access-kh9kn") pod "77e26ddb-4b47-4b06-a390-76653b75c503" (UID: "77e26ddb-4b47-4b06-a390-76653b75c503"). InnerVolumeSpecName "kube-api-access-kh9kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.153205 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77e26ddb-4b47-4b06-a390-76653b75c503" (UID: "77e26ddb-4b47-4b06-a390-76653b75c503"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.187427 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-inventory" (OuterVolumeSpecName: "inventory") pod "77e26ddb-4b47-4b06-a390-76653b75c503" (UID: "77e26ddb-4b47-4b06-a390-76653b75c503"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.194489 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77e26ddb-4b47-4b06-a390-76653b75c503" (UID: "77e26ddb-4b47-4b06-a390-76653b75c503"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.246539 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.246581 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh9kn\" (UniqueName: \"kubernetes.io/projected/77e26ddb-4b47-4b06-a390-76653b75c503-kube-api-access-kh9kn\") on node \"crc\" DevicePath \"\"" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.246596 4782 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.246610 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77e26ddb-4b47-4b06-a390-76653b75c503-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.519577 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" event={"ID":"77e26ddb-4b47-4b06-a390-76653b75c503","Type":"ContainerDied","Data":"fe32c7eeb48aab9612b6d6204f127efbd3d55a1077ef8d504921af27bc96b208"} Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.519626 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe32c7eeb48aab9612b6d6204f127efbd3d55a1077ef8d504921af27bc96b208" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.519698 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.595750 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p"] Jan 30 18:54:06 crc kubenswrapper[4782]: E0130 18:54:06.596196 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e26ddb-4b47-4b06-a390-76653b75c503" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.596213 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e26ddb-4b47-4b06-a390-76653b75c503" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.596466 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e26ddb-4b47-4b06-a390-76653b75c503" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.597193 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.599587 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.599705 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.599592 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.599941 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.605967 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p"] Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.763753 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.764198 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4wx\" (UniqueName: \"kubernetes.io/projected/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-kube-api-access-gh4wx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.764455 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.865914 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.865954 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4wx\" (UniqueName: \"kubernetes.io/projected/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-kube-api-access-gh4wx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.866002 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.870337 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.870884 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.885000 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4wx\" (UniqueName: \"kubernetes.io/projected/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-kube-api-access-gh4wx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-42r9p\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:06 crc kubenswrapper[4782]: I0130 18:54:06.963892 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:07 crc kubenswrapper[4782]: I0130 18:54:07.515996 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p"] Jan 30 18:54:08 crc kubenswrapper[4782]: I0130 18:54:08.543943 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" event={"ID":"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f","Type":"ContainerStarted","Data":"d180ba529bfe92cce11f199fd1996ecfbf7759f6f94b0c508aa83c52768fa327"} Jan 30 18:54:08 crc kubenswrapper[4782]: I0130 18:54:08.544632 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" event={"ID":"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f","Type":"ContainerStarted","Data":"c604c267fe997c493bc59667bbfade3add73859d4725b1e2e1a9bfa204019292"} Jan 30 18:54:08 crc kubenswrapper[4782]: I0130 18:54:08.563539 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" podStartSLOduration=2.08830387 podStartE2EDuration="2.563507227s" podCreationTimestamp="2026-01-30 18:54:06 +0000 UTC" firstStartedPulling="2026-01-30 18:54:07.531738571 +0000 UTC m=+1423.800116606" lastFinishedPulling="2026-01-30 18:54:08.006941918 +0000 UTC m=+1424.275319963" observedRunningTime="2026-01-30 18:54:08.562472401 +0000 UTC m=+1424.830850476" watchObservedRunningTime="2026-01-30 18:54:08.563507227 +0000 UTC m=+1424.831885292" Jan 30 18:54:11 crc kubenswrapper[4782]: I0130 18:54:11.575255 4782 generic.go:334] "Generic (PLEG): container finished" podID="f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f" containerID="d180ba529bfe92cce11f199fd1996ecfbf7759f6f94b0c508aa83c52768fa327" exitCode=0 Jan 30 18:54:11 crc kubenswrapper[4782]: I0130 18:54:11.575333 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" event={"ID":"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f","Type":"ContainerDied","Data":"d180ba529bfe92cce11f199fd1996ecfbf7759f6f94b0c508aa83c52768fa327"} Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.129927 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.257610 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh4wx\" (UniqueName: \"kubernetes.io/projected/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-kube-api-access-gh4wx\") pod \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.257679 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-ssh-key-openstack-edpm-ipam\") pod \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.257795 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-inventory\") pod \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\" (UID: \"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f\") " Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.266785 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-kube-api-access-gh4wx" (OuterVolumeSpecName: "kube-api-access-gh4wx") pod "f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f" (UID: "f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f"). InnerVolumeSpecName "kube-api-access-gh4wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.293475 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f" (UID: "f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.300907 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-inventory" (OuterVolumeSpecName: "inventory") pod "f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f" (UID: "f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.360254 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh4wx\" (UniqueName: \"kubernetes.io/projected/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-kube-api-access-gh4wx\") on node \"crc\" DevicePath \"\"" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.360611 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.360623 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.603475 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" event={"ID":"f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f","Type":"ContainerDied","Data":"c604c267fe997c493bc59667bbfade3add73859d4725b1e2e1a9bfa204019292"} Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.603517 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c604c267fe997c493bc59667bbfade3add73859d4725b1e2e1a9bfa204019292" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.603567 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-42r9p" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.681616 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj"] Jan 30 18:54:13 crc kubenswrapper[4782]: E0130 18:54:13.681991 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.682008 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.682221 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.682997 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.684947 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.688650 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.689173 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.689414 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.709211 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj"] Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.870595 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.870654 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8h4z\" (UniqueName: \"kubernetes.io/projected/f9e549bf-994f-46e6-9d42-72a655229b73-kube-api-access-r8h4z\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.870809 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.870859 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.973109 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.973185 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.973283 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.973349 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8h4z\" (UniqueName: \"kubernetes.io/projected/f9e549bf-994f-46e6-9d42-72a655229b73-kube-api-access-r8h4z\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.978820 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.983171 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:13 crc kubenswrapper[4782]: I0130 18:54:13.995266 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:14 crc kubenswrapper[4782]: I0130 18:54:14.003145 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8h4z\" (UniqueName: \"kubernetes.io/projected/f9e549bf-994f-46e6-9d42-72a655229b73-kube-api-access-r8h4z\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:14 crc kubenswrapper[4782]: I0130 18:54:14.006959 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:54:14 crc kubenswrapper[4782]: I0130 18:54:14.588339 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj"] Jan 30 18:54:14 crc kubenswrapper[4782]: I0130 18:54:14.618095 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" event={"ID":"f9e549bf-994f-46e6-9d42-72a655229b73","Type":"ContainerStarted","Data":"8f3b03f818a966cfbb55b6c19b8a312310bc98b8cb6aaba5d53cd30c43beed59"} Jan 30 18:54:15 crc kubenswrapper[4782]: I0130 18:54:15.632967 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" event={"ID":"f9e549bf-994f-46e6-9d42-72a655229b73","Type":"ContainerStarted","Data":"115bf5a91e98c4b7d96cd51866add631ceb8404199849ab7328267c6ccad03b4"} Jan 30 18:54:15 crc kubenswrapper[4782]: I0130 18:54:15.662279 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" podStartSLOduration=1.94150883 podStartE2EDuration="2.662254407s" podCreationTimestamp="2026-01-30 18:54:13 +0000 UTC" firstStartedPulling="2026-01-30 18:54:14.594304457 +0000 UTC m=+1430.862682492" lastFinishedPulling="2026-01-30 18:54:15.315050004 +0000 UTC m=+1431.583428069" observedRunningTime="2026-01-30 18:54:15.65590573 +0000 UTC m=+1431.924283795" watchObservedRunningTime="2026-01-30 18:54:15.662254407 +0000 UTC m=+1431.930632472" Jan 30 18:54:19 crc kubenswrapper[4782]: I0130 18:54:19.792308 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:54:19 crc kubenswrapper[4782]: I0130 18:54:19.792668 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:54:22 crc kubenswrapper[4782]: I0130 18:54:22.170457 4782 scope.go:117] "RemoveContainer" containerID="32774d969d034474cf88dae3b5c042e29584f630eda4fec4dc8a5aee101211af" Jan 30 18:54:22 crc kubenswrapper[4782]: I0130 18:54:22.243861 4782 scope.go:117] "RemoveContainer" containerID="5bee7ff4bdab2b1e98536ca8e9369f2b7f05e4fcf60259e2742a3240d8fd2d4f" Jan 30 18:54:22 crc kubenswrapper[4782]: I0130 18:54:22.304109 4782 scope.go:117] "RemoveContainer" containerID="e711fa721ad0a82a8f463a6ab7d1c38710fc1b9a7e3d2c723b82eee21147acbb" Jan 30 18:54:49 crc kubenswrapper[4782]: I0130 18:54:49.793125 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:54:49 crc kubenswrapper[4782]: I0130 18:54:49.793917 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:54:49 crc kubenswrapper[4782]: I0130 18:54:49.793987 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:54:49 crc kubenswrapper[4782]: I0130 18:54:49.795481 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2877db38d8b8ea57e1667182b8a07ac85a48d7c731015462509e5b9de4f7748"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:54:49 crc kubenswrapper[4782]: I0130 18:54:49.795590 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://e2877db38d8b8ea57e1667182b8a07ac85a48d7c731015462509e5b9de4f7748" gracePeriod=600 Jan 30 18:54:50 crc kubenswrapper[4782]: I0130 18:54:50.040848 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="e2877db38d8b8ea57e1667182b8a07ac85a48d7c731015462509e5b9de4f7748" exitCode=0 Jan 30 18:54:50 crc kubenswrapper[4782]: I0130 18:54:50.041062 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"e2877db38d8b8ea57e1667182b8a07ac85a48d7c731015462509e5b9de4f7748"} Jan 30 18:54:50 crc kubenswrapper[4782]: I0130 18:54:50.041179 4782 scope.go:117] "RemoveContainer" containerID="30752c64226ba6f7e596e12313e1d813b202b07c8fa5c5bad850072993bf2126" Jan 30 18:54:51 crc kubenswrapper[4782]: I0130 18:54:51.052630 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a"} Jan 30 18:55:22 crc kubenswrapper[4782]: I0130 18:55:22.459124 4782 scope.go:117] "RemoveContainer" containerID="07829a177d4c5bb5f2e54de53ef57ae9464ada31d0843e33381de37958a880f6" Jan 30 18:56:22 crc kubenswrapper[4782]: I0130 18:56:22.542472 4782 scope.go:117] "RemoveContainer" containerID="9454f90cec4c047dd7f4d13b8551aa39d63486a6d2cdfa864f0b839315136218" Jan 30 18:56:22 crc kubenswrapper[4782]: I0130 18:56:22.572585 4782 scope.go:117] "RemoveContainer" containerID="a873fad9675ee26d9360b345482b465a6a465f5d77368b767258ac24b5b72bf2" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.378852 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nggcx"] Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.387553 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.414117 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nggcx"] Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.516934 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-utilities\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.517159 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kb9v\" (UniqueName: \"kubernetes.io/projected/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-kube-api-access-9kb9v\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.517867 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-catalog-content\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.621041 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-catalog-content\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.621331 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-utilities\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.621663 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kb9v\" (UniqueName: \"kubernetes.io/projected/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-kube-api-access-9kb9v\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.622088 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-utilities\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.622652 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-catalog-content\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.649132 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9kb9v\" (UniqueName: \"kubernetes.io/projected/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-kube-api-access-9kb9v\") pod \"certified-operators-nggcx\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:37 crc kubenswrapper[4782]: I0130 18:56:37.723029 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:38 crc kubenswrapper[4782]: I0130 18:56:38.231499 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nggcx"] Jan 30 18:56:38 crc kubenswrapper[4782]: I0130 18:56:38.363216 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nggcx" event={"ID":"b7f07008-c7f9-4623-ae19-a8e1965a6f2a","Type":"ContainerStarted","Data":"fddedc9de2b2fab85a8bd0b352972c4c38e51b22ec4e54e8864fcac4c2c638e4"} Jan 30 18:56:39 crc kubenswrapper[4782]: I0130 18:56:39.374933 4782 generic.go:334] "Generic (PLEG): container finished" podID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerID="d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09" exitCode=0 Jan 30 18:56:39 crc kubenswrapper[4782]: I0130 18:56:39.375073 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nggcx" event={"ID":"b7f07008-c7f9-4623-ae19-a8e1965a6f2a","Type":"ContainerDied","Data":"d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09"} Jan 30 18:56:39 crc kubenswrapper[4782]: I0130 18:56:39.377725 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 18:56:41 crc kubenswrapper[4782]: I0130 18:56:41.394889 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nggcx" event={"ID":"b7f07008-c7f9-4623-ae19-a8e1965a6f2a","Type":"ContainerStarted","Data":"f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf"} Jan 30 18:56:42 crc kubenswrapper[4782]: I0130 18:56:42.409104 4782 generic.go:334] "Generic (PLEG): container finished" podID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerID="f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf" exitCode=0 Jan 30 18:56:42 crc kubenswrapper[4782]: I0130 18:56:42.409353 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nggcx" event={"ID":"b7f07008-c7f9-4623-ae19-a8e1965a6f2a","Type":"ContainerDied","Data":"f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf"} Jan 30 18:56:43 crc kubenswrapper[4782]: I0130 18:56:43.422062 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nggcx" event={"ID":"b7f07008-c7f9-4623-ae19-a8e1965a6f2a","Type":"ContainerStarted","Data":"b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc"} Jan 30 18:56:43 crc kubenswrapper[4782]: I0130 18:56:43.454039 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nggcx" podStartSLOduration=2.996576314 podStartE2EDuration="6.454018124s" podCreationTimestamp="2026-01-30 18:56:37 +0000 UTC" firstStartedPulling="2026-01-30 18:56:39.377359918 +0000 UTC m=+1575.645737953" lastFinishedPulling="2026-01-30 18:56:42.834801738 +0000 UTC m=+1579.103179763" observedRunningTime="2026-01-30 18:56:43.445450773 +0000 UTC m=+1579.713828818" watchObservedRunningTime="2026-01-30 
18:56:43.454018124 +0000 UTC m=+1579.722396159" Jan 30 18:56:47 crc kubenswrapper[4782]: I0130 18:56:47.724140 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:47 crc kubenswrapper[4782]: I0130 18:56:47.724810 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:47 crc kubenswrapper[4782]: I0130 18:56:47.803455 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:48 crc kubenswrapper[4782]: I0130 18:56:48.516694 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:48 crc kubenswrapper[4782]: I0130 18:56:48.572671 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nggcx"] Jan 30 18:56:50 crc kubenswrapper[4782]: I0130 18:56:50.505334 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nggcx" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="registry-server" containerID="cri-o://b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc" gracePeriod=2 Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.017690 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.095425 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-utilities\") pod \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.095537 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-catalog-content\") pod \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.095577 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kb9v\" (UniqueName: \"kubernetes.io/projected/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-kube-api-access-9kb9v\") pod \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\" (UID: \"b7f07008-c7f9-4623-ae19-a8e1965a6f2a\") " Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.096701 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-utilities" (OuterVolumeSpecName: "utilities") pod "b7f07008-c7f9-4623-ae19-a8e1965a6f2a" (UID: "b7f07008-c7f9-4623-ae19-a8e1965a6f2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.101212 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-kube-api-access-9kb9v" (OuterVolumeSpecName: "kube-api-access-9kb9v") pod "b7f07008-c7f9-4623-ae19-a8e1965a6f2a" (UID: "b7f07008-c7f9-4623-ae19-a8e1965a6f2a"). InnerVolumeSpecName "kube-api-access-9kb9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.148958 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7f07008-c7f9-4623-ae19-a8e1965a6f2a" (UID: "b7f07008-c7f9-4623-ae19-a8e1965a6f2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.197620 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.197658 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.197675 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kb9v\" (UniqueName: \"kubernetes.io/projected/b7f07008-c7f9-4623-ae19-a8e1965a6f2a-kube-api-access-9kb9v\") on node \"crc\" DevicePath \"\"" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.519408 4782 generic.go:334] "Generic (PLEG): container finished" podID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerID="b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc" exitCode=0 Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.519455 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nggcx" event={"ID":"b7f07008-c7f9-4623-ae19-a8e1965a6f2a","Type":"ContainerDied","Data":"b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc"} Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.519494 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nggcx" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.519528 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nggcx" event={"ID":"b7f07008-c7f9-4623-ae19-a8e1965a6f2a","Type":"ContainerDied","Data":"fddedc9de2b2fab85a8bd0b352972c4c38e51b22ec4e54e8864fcac4c2c638e4"} Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.519559 4782 scope.go:117] "RemoveContainer" containerID="b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.545494 4782 scope.go:117] "RemoveContainer" containerID="f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.581886 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nggcx"] Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.589662 4782 scope.go:117] "RemoveContainer" containerID="d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.595506 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nggcx"] Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.615916 4782 scope.go:117] "RemoveContainer" containerID="b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc" Jan 30 18:56:51 crc kubenswrapper[4782]: E0130 18:56:51.616490 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc\": container with ID starting with b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc not found: ID does not exist" containerID="b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.616532 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc"} err="failed to get container status \"b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc\": rpc error: code = NotFound desc = could not find container \"b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc\": container with ID starting with b19fe6e8528854ed6c3a484911a7b2d38069aeec2d1e73d9c39afdc59e45b1cc not found: ID does not exist" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.616565 4782 scope.go:117] "RemoveContainer" containerID="f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf" Jan 30 18:56:51 crc kubenswrapper[4782]: E0130 18:56:51.616999 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf\": container with ID starting with f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf not found: ID does not exist" containerID="f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.617043 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf"} err="failed to get container status \"f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf\": rpc error: code = NotFound desc = could not find 
container \"f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf\": container with ID starting with f4acd07a28d1b7793b9c8e240567a239080fb27ae6f461a6ba61fd540bb35ddf not found: ID does not exist" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.617069 4782 scope.go:117] "RemoveContainer" containerID="d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09" Jan 30 18:56:51 crc kubenswrapper[4782]: E0130 18:56:51.617624 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09\": container with ID starting with d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09 not found: ID does not exist" containerID="d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09" Jan 30 18:56:51 crc kubenswrapper[4782]: I0130 18:56:51.617669 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09"} err="failed to get container status \"d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09\": rpc error: code = NotFound desc = could not find container \"d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09\": container with ID starting with d6c58f6202c2888152ba725c9bad4da89a3a42b97799762f0c9b247bc4846f09 not found: ID does not exist" Jan 30 18:56:52 crc kubenswrapper[4782]: I0130 18:56:52.426889 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" path="/var/lib/kubelet/pods/b7f07008-c7f9-4623-ae19-a8e1965a6f2a/volumes" Jan 30 18:57:08 crc kubenswrapper[4782]: I0130 18:57:08.717095 4782 generic.go:334] "Generic (PLEG): container finished" podID="f9e549bf-994f-46e6-9d42-72a655229b73" containerID="115bf5a91e98c4b7d96cd51866add631ceb8404199849ab7328267c6ccad03b4" exitCode=0 Jan 30 18:57:08 crc kubenswrapper[4782]: I0130 18:57:08.717183 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" event={"ID":"f9e549bf-994f-46e6-9d42-72a655229b73","Type":"ContainerDied","Data":"115bf5a91e98c4b7d96cd51866add631ceb8404199849ab7328267c6ccad03b4"} Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.150732 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.313097 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-bootstrap-combined-ca-bundle\") pod \"f9e549bf-994f-46e6-9d42-72a655229b73\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.313521 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8h4z\" (UniqueName: \"kubernetes.io/projected/f9e549bf-994f-46e6-9d42-72a655229b73-kube-api-access-r8h4z\") pod \"f9e549bf-994f-46e6-9d42-72a655229b73\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.313544 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-inventory\") pod \"f9e549bf-994f-46e6-9d42-72a655229b73\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.313715 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-ssh-key-openstack-edpm-ipam\") pod \"f9e549bf-994f-46e6-9d42-72a655229b73\" (UID: \"f9e549bf-994f-46e6-9d42-72a655229b73\") " Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.319167 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e549bf-994f-46e6-9d42-72a655229b73-kube-api-access-r8h4z" (OuterVolumeSpecName: "kube-api-access-r8h4z") pod "f9e549bf-994f-46e6-9d42-72a655229b73" (UID: "f9e549bf-994f-46e6-9d42-72a655229b73"). InnerVolumeSpecName "kube-api-access-r8h4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.322266 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f9e549bf-994f-46e6-9d42-72a655229b73" (UID: "f9e549bf-994f-46e6-9d42-72a655229b73"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.351488 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f9e549bf-994f-46e6-9d42-72a655229b73" (UID: "f9e549bf-994f-46e6-9d42-72a655229b73"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.359813 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-inventory" (OuterVolumeSpecName: "inventory") pod "f9e549bf-994f-46e6-9d42-72a655229b73" (UID: "f9e549bf-994f-46e6-9d42-72a655229b73"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.415713 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8h4z\" (UniqueName: \"kubernetes.io/projected/f9e549bf-994f-46e6-9d42-72a655229b73-kube-api-access-r8h4z\") on node \"crc\" DevicePath \"\"" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.415740 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.415750 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.415759 4782 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e549bf-994f-46e6-9d42-72a655229b73-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.743770 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" event={"ID":"f9e549bf-994f-46e6-9d42-72a655229b73","Type":"ContainerDied","Data":"8f3b03f818a966cfbb55b6c19b8a312310bc98b8cb6aaba5d53cd30c43beed59"} Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.743819 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f3b03f818a966cfbb55b6c19b8a312310bc98b8cb6aaba5d53cd30c43beed59" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.743822 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.833838 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf"] Jan 30 18:57:10 crc kubenswrapper[4782]: E0130 18:57:10.834738 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="registry-server" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.834789 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="registry-server" Jan 30 18:57:10 crc kubenswrapper[4782]: E0130 18:57:10.834828 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="extract-utilities" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.834846 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="extract-utilities" Jan 30 18:57:10 crc kubenswrapper[4782]: E0130 18:57:10.834912 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e549bf-994f-46e6-9d42-72a655229b73" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.834932 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e549bf-994f-46e6-9d42-72a655229b73" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 30 18:57:10 crc kubenswrapper[4782]: E0130 18:57:10.834985 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="extract-content" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.835001 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="extract-content" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.835493 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e549bf-994f-46e6-9d42-72a655229b73" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.835565 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7f07008-c7f9-4623-ae19-a8e1965a6f2a" containerName="registry-server" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.837074 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.839697 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.840162 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.840542 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.840731 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.845966 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf"] Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.925812 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnxm7\" (UniqueName: \"kubernetes.io/projected/92905892-4424-4957-a945-eb130f92d03f-kube-api-access-rnxm7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.925861 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:10 crc kubenswrapper[4782]: I0130 18:57:10.926018 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.028801 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnxm7\" (UniqueName: \"kubernetes.io/projected/92905892-4424-4957-a945-eb130f92d03f-kube-api-access-rnxm7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.028874 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.028945 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.037257 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.037578 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.046741 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnxm7\" (UniqueName: \"kubernetes.io/projected/92905892-4424-4957-a945-eb130f92d03f-kube-api-access-rnxm7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-58pjf\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.217061 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:57:11 crc kubenswrapper[4782]: I0130 18:57:11.772355 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf"] Jan 30 18:57:12 crc kubenswrapper[4782]: I0130 18:57:12.778779 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" event={"ID":"92905892-4424-4957-a945-eb130f92d03f","Type":"ContainerStarted","Data":"9aaacb485fac79af73f602a9875462fbabdfec08c919fddc39793ef3f006fa50"} Jan 30 18:57:12 crc kubenswrapper[4782]: I0130 18:57:12.779202 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" event={"ID":"92905892-4424-4957-a945-eb130f92d03f","Type":"ContainerStarted","Data":"ca8b5c6be9bf7f9a099d4b51f9edf2948e8e48d8e1a53eecab558f2e35fa9996"} Jan 30 18:57:12 crc kubenswrapper[4782]: I0130 18:57:12.812361 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" podStartSLOduration=2.397235659 podStartE2EDuration="2.812336801s" podCreationTimestamp="2026-01-30 18:57:10 +0000 UTC" firstStartedPulling="2026-01-30 18:57:11.774763112 +0000 UTC m=+1608.043141147" lastFinishedPulling="2026-01-30 18:57:12.189864264 +0000 UTC m=+1608.458242289" observedRunningTime="2026-01-30 18:57:12.80379345 +0000 UTC m=+1609.072171515" watchObservedRunningTime="2026-01-30 18:57:12.812336801 +0000 UTC m=+1609.080714856" Jan 30 18:57:19 crc kubenswrapper[4782]: I0130 18:57:19.792645 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:57:19 crc kubenswrapper[4782]: I0130 18:57:19.793215 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:57:22 crc kubenswrapper[4782]: I0130 18:57:22.646428 4782 scope.go:117] "RemoveContainer" containerID="79dacf1e12cdbe932432a7b02ab32d021eaf66b9041594df821d25a32203eeb0" Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.065654 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2b50-account-create-update-sbscq"] Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.082748 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d17b-account-create-update-lswx2"] Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.099506 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2b50-account-create-update-sbscq"] Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.110730 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m5tvv"] Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.122126 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d17b-account-create-update-lswx2"] Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.132807 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m5tvv"] Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.140833 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xj8zj"] Jan 30 18:57:45 crc kubenswrapper[4782]: I0130 18:57:45.148890 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xj8zj"] Jan 30 18:57:46 crc kubenswrapper[4782]: I0130 18:57:46.422355 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5676c98c-0259-43b5-a04f-12e9f8f74746" path="/var/lib/kubelet/pods/5676c98c-0259-43b5-a04f-12e9f8f74746/volumes" Jan 30 18:57:46 crc kubenswrapper[4782]: I0130 18:57:46.424016 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d82414-65bf-4612-8255-097d4e82b25b" path="/var/lib/kubelet/pods/80d82414-65bf-4612-8255-097d4e82b25b/volumes" Jan 30 18:57:46 crc kubenswrapper[4782]: I0130 18:57:46.425266 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1" path="/var/lib/kubelet/pods/b0fbb7fb-980a-4b5c-b6ca-5e77d3e3a2e1/volumes" Jan 30 18:57:46 crc kubenswrapper[4782]: I0130 18:57:46.426686 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18bb25b-4060-4f0d-8856-3e90a46209d9" path="/var/lib/kubelet/pods/b18bb25b-4060-4f0d-8856-3e90a46209d9/volumes" Jan 30 18:57:48 crc kubenswrapper[4782]: I0130 18:57:48.031804 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-75b9-account-create-update-sdvxj"] Jan 30 18:57:48 crc kubenswrapper[4782]: I0130 18:57:48.043897 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-svqmg"] Jan 30 18:57:48 crc kubenswrapper[4782]: I0130 18:57:48.060458 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/watcher-75b9-account-create-update-sdvxj"] Jan 30 18:57:48 crc kubenswrapper[4782]: I0130 18:57:48.068745 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-svqmg"] Jan 30 18:57:48 crc kubenswrapper[4782]: I0130 18:57:48.426169 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452e963a-2af0-417d-a9f2-4ce3490829e3" path="/var/lib/kubelet/pods/452e963a-2af0-417d-a9f2-4ce3490829e3/volumes" Jan 30 18:57:48 crc kubenswrapper[4782]: I0130 18:57:48.430079 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c449eb35-703e-4c85-b4ec-52918bedb59d" path="/var/lib/kubelet/pods/c449eb35-703e-4c85-b4ec-52918bedb59d/volumes" Jan 30 18:57:49 crc kubenswrapper[4782]: I0130 18:57:49.792725 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 18:57:49 crc kubenswrapper[4782]: I0130 18:57:49.792817 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:57:52 crc kubenswrapper[4782]: I0130 18:57:52.043974 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-49bsv"] Jan 30 18:57:52 crc kubenswrapper[4782]: I0130 18:57:52.060681 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-49bsv"] Jan 30 18:57:52 crc kubenswrapper[4782]: I0130 18:57:52.426089 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d0227d7-251c-4e1b-ac70-2c923e735208" path="/var/lib/kubelet/pods/0d0227d7-251c-4e1b-ac70-2c923e735208/volumes" Jan 30 18:58:01 crc kubenswrapper[4782]: I0130 18:58:01.050691 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-430c-account-create-update-snr88"] Jan 30 18:58:01 crc kubenswrapper[4782]: I0130 18:58:01.065534 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vddgn"] Jan 30 18:58:01 crc kubenswrapper[4782]: I0130 18:58:01.076865 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vddgn"] Jan 30 18:58:01 crc kubenswrapper[4782]: I0130 18:58:01.087417 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-430c-account-create-update-snr88"] Jan 30 18:58:01 crc kubenswrapper[4782]: I0130 18:58:01.096482 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-48gwt"] Jan 30 18:58:01 crc kubenswrapper[4782]: I0130 18:58:01.104748 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-48gwt"] Jan 30 18:58:02 crc kubenswrapper[4782]: I0130 18:58:02.033709 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-cff8-account-create-update-dnxld"] Jan 30 18:58:02 crc kubenswrapper[4782]: I0130 18:58:02.044033 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-cff8-account-create-update-dnxld"] Jan 30 18:58:02 crc kubenswrapper[4782]: I0130 18:58:02.443074 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8bcdb105-a248-440d-89b4-8c4ad0420ecc" path="/var/lib/kubelet/pods/8bcdb105-a248-440d-89b4-8c4ad0420ecc/volumes" Jan 30 18:58:02 crc kubenswrapper[4782]: I0130 18:58:02.445307 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c7a692-b10f-4799-bae7-8e2d88d26d22" path="/var/lib/kubelet/pods/d7c7a692-b10f-4799-bae7-8e2d88d26d22/volumes" Jan 30 18:58:02 crc kubenswrapper[4782]: I0130 18:58:02.447267 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee" path="/var/lib/kubelet/pods/dbbf9aea-fd5b-4f82-a2e1-c9a2cb2c2dee/volumes" Jan 30 18:58:02 crc kubenswrapper[4782]: I0130 18:58:02.448440 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe1d4b19-02b2-4c3b-9e1a-eb9505b41503" path="/var/lib/kubelet/pods/fe1d4b19-02b2-4c3b-9e1a-eb9505b41503/volumes" Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.034828 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lqdnd"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.045386 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lqdnd"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.056520 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rkfpr"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.068851 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5421-account-create-update-5nkvt"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.076986 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rkfpr"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.084087 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5421-account-create-update-5nkvt"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.091909 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-cac0-account-create-update-khnrr"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.099090 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-cac0-account-create-update-khnrr"] Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.426620 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f488b46-84ff-405e-abbe-87d54f15a5f5" path="/var/lib/kubelet/pods/0f488b46-84ff-405e-abbe-87d54f15a5f5/volumes" Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.429333 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d2e1f3-f06f-44b0-81a0-9b1117aea096" path="/var/lib/kubelet/pods/69d2e1f3-f06f-44b0-81a0-9b1117aea096/volumes" Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.431722 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce13f8f-f62e-426f-b59b-6a15cb3237bb" path="/var/lib/kubelet/pods/7ce13f8f-f62e-426f-b59b-6a15cb3237bb/volumes" Jan 30 18:58:10 crc kubenswrapper[4782]: I0130 18:58:10.433769 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c361db-acc1-4cf3-aff9-f66661a2e327" path="/var/lib/kubelet/pods/a8c361db-acc1-4cf3-aff9-f66661a2e327/volumes" Jan 30 18:58:19 crc kubenswrapper[4782]: I0130 18:58:19.793106 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 30 18:58:19 crc kubenswrapper[4782]: I0130 18:58:19.793756 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 18:58:19 crc kubenswrapper[4782]: I0130 18:58:19.793808 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 18:58:19 crc kubenswrapper[4782]: I0130 18:58:19.794649 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 18:58:19 crc kubenswrapper[4782]: I0130 18:58:19.794720 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" gracePeriod=600 Jan 30 18:58:19 crc kubenswrapper[4782]: E0130 18:58:19.919039 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.050144 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-lnnb7"] Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.061222 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-c4cx2"] Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.069923 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-lnnb7"] Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.077564 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-c4cx2"] Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.431672 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e94436-5c08-4fa9-8e93-8929251269ff" path="/var/lib/kubelet/pods/16e94436-5c08-4fa9-8e93-8929251269ff/volumes" Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.433145 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa667b9-2c1d-4219-9f23-666323d7f509" path="/var/lib/kubelet/pods/5aa667b9-2c1d-4219-9f23-666323d7f509/volumes" Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.517576 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" exitCode=0 Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.517621 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" 
event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a"} Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.517658 4782 scope.go:117] "RemoveContainer" containerID="e2877db38d8b8ea57e1667182b8a07ac85a48d7c731015462509e5b9de4f7748" Jan 30 18:58:20 crc kubenswrapper[4782]: I0130 18:58:20.518302 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:58:20 crc kubenswrapper[4782]: E0130 18:58:20.518551 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:58:22 crc kubenswrapper[4782]: I0130 18:58:22.742114 4782 scope.go:117] "RemoveContainer" containerID="5c7bdfb1bdb95d8fd0d941e12ab706c53dab9816c94d120ff51705f269413965" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.011720 4782 scope.go:117] "RemoveContainer" containerID="27b16233d6c6435ae002778664c2f5e610da53503977ac9cbffc76a512829a6a" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.041953 4782 scope.go:117] "RemoveContainer" containerID="6d375d9916b814b6d8de486a8387ae232b7c4a3f238ca32c98eaf2f71b4265c5" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.080176 4782 scope.go:117] "RemoveContainer" containerID="28014afbc1904184eb012bb140f92300a8fc6040f9c157d149230c64231a20bb" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.141689 4782 scope.go:117] "RemoveContainer" containerID="c76799fd89750a18b8e322f9aa27474cace0504e98179f692441c6108201ddfb" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.185141 4782 scope.go:117] "RemoveContainer" containerID="e9c029ed1031ab8f9741c88e25f7433f0516f565f0ca555b47f2321112cca9d0" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.244528 4782 scope.go:117] "RemoveContainer" containerID="5da084ba64db7ade035b8ed1dd0a82f59c530670ca4f5e1c2fd28bbf8cec3adf" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.284043 4782 scope.go:117] "RemoveContainer" containerID="d4d72ca3e929937c0055d22ab8a6d716ebb4e8e358a6caa2a13cec5ae25e6262" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.312573 4782 scope.go:117] "RemoveContainer" containerID="c62d4f14221ba5405a0d83b0d4b68cdf23119cd81602a31b59ea10e455587c48" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.339616 4782 scope.go:117] "RemoveContainer" containerID="ec684cc32e76bdf862e44c6145b5c52e5f6176da735b769e01dfbdd2e1617872" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.372191 4782 scope.go:117] "RemoveContainer" containerID="63f5072ad71d5e389523e43b61f20d7de63e0a51efa92955161b53c1927f0abf" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.400178 4782 scope.go:117] "RemoveContainer" containerID="e8123a25d952fe4b1c57a157863df1d87c46b680beabbe16a538608abe0b8bfc" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.422436 4782 scope.go:117] "RemoveContainer" containerID="11f9764b35b7c74af148cec5204cd14d3224b9f03d6b3a2b0344235f061774d3" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.449445 4782 scope.go:117] "RemoveContainer" containerID="b183169d7e6af3e3159dcba10baa92df856271979556ec5a652b91f7470e9c0e" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 
18:58:23.479403 4782 scope.go:117] "RemoveContainer" containerID="eef16c5dad318cdaffe998186a3495b644a699b58c83f4c776a66d1dbd763684" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.514049 4782 scope.go:117] "RemoveContainer" containerID="765683b4de05082bbce5d7d676196e5d33546263498f7e3b30d6630bc3b582df" Jan 30 18:58:23 crc kubenswrapper[4782]: I0130 18:58:23.541382 4782 scope.go:117] "RemoveContainer" containerID="e12bf6139dea373825462c63db019c3c88d79fb991c65b2d2912268e4bedd630" Jan 30 18:58:31 crc kubenswrapper[4782]: I0130 18:58:31.411515 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:58:31 crc kubenswrapper[4782]: E0130 18:58:31.412469 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:58:36 crc kubenswrapper[4782]: I0130 18:58:36.759162 4782 generic.go:334] "Generic (PLEG): container finished" podID="92905892-4424-4957-a945-eb130f92d03f" containerID="9aaacb485fac79af73f602a9875462fbabdfec08c919fddc39793ef3f006fa50" exitCode=0 Jan 30 18:58:36 crc kubenswrapper[4782]: I0130 18:58:36.759369 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" event={"ID":"92905892-4424-4957-a945-eb130f92d03f","Type":"ContainerDied","Data":"9aaacb485fac79af73f602a9875462fbabdfec08c919fddc39793ef3f006fa50"} Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.219460 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.397523 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-ssh-key-openstack-edpm-ipam\") pod \"92905892-4424-4957-a945-eb130f92d03f\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.398985 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-inventory\") pod \"92905892-4424-4957-a945-eb130f92d03f\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.399127 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnxm7\" (UniqueName: \"kubernetes.io/projected/92905892-4424-4957-a945-eb130f92d03f-kube-api-access-rnxm7\") pod \"92905892-4424-4957-a945-eb130f92d03f\" (UID: \"92905892-4424-4957-a945-eb130f92d03f\") " Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.440125 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92905892-4424-4957-a945-eb130f92d03f-kube-api-access-rnxm7" (OuterVolumeSpecName: "kube-api-access-rnxm7") pod "92905892-4424-4957-a945-eb130f92d03f" (UID: "92905892-4424-4957-a945-eb130f92d03f"). InnerVolumeSpecName "kube-api-access-rnxm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.452135 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "92905892-4424-4957-a945-eb130f92d03f" (UID: "92905892-4424-4957-a945-eb130f92d03f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.457424 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-inventory" (OuterVolumeSpecName: "inventory") pod "92905892-4424-4957-a945-eb130f92d03f" (UID: "92905892-4424-4957-a945-eb130f92d03f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.504048 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.504094 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92905892-4424-4957-a945-eb130f92d03f-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.504110 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnxm7\" (UniqueName: \"kubernetes.io/projected/92905892-4424-4957-a945-eb130f92d03f-kube-api-access-rnxm7\") on node \"crc\" DevicePath \"\"" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.791368 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" event={"ID":"92905892-4424-4957-a945-eb130f92d03f","Type":"ContainerDied","Data":"ca8b5c6be9bf7f9a099d4b51f9edf2948e8e48d8e1a53eecab558f2e35fa9996"} Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.791420 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca8b5c6be9bf7f9a099d4b51f9edf2948e8e48d8e1a53eecab558f2e35fa9996" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.791445 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-58pjf" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.881126 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg"] Jan 30 18:58:38 crc kubenswrapper[4782]: E0130 18:58:38.882076 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92905892-4424-4957-a945-eb130f92d03f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.882101 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="92905892-4424-4957-a945-eb130f92d03f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.882395 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="92905892-4424-4957-a945-eb130f92d03f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.883344 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.885545 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.885824 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.886200 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.886736 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.898880 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg"] Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.909605 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.910078 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wth\" (UniqueName: \"kubernetes.io/projected/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-kube-api-access-d8wth\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:38 crc kubenswrapper[4782]: I0130 18:58:38.910155 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.011475 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8wth\" (UniqueName: \"kubernetes.io/projected/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-kube-api-access-d8wth\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.011527 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.011589 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.019459 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.019507 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.041013 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wth\" (UniqueName: \"kubernetes.io/projected/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-kube-api-access-d8wth\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.199166 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:58:39 crc kubenswrapper[4782]: I0130 18:58:39.806309 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg"] Jan 30 18:58:40 crc kubenswrapper[4782]: I0130 18:58:40.817623 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" event={"ID":"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566","Type":"ContainerStarted","Data":"354a83c3f533cac069d729cd6bc2f42312811d003e1836b4e271880e4bc87f41"} Jan 30 18:58:40 crc kubenswrapper[4782]: I0130 18:58:40.818343 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" event={"ID":"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566","Type":"ContainerStarted","Data":"7d2e04b87c3bec6555074e55676844a80216bfc6a3ab2d721fcbe35a0b949bf1"} Jan 30 18:58:40 crc kubenswrapper[4782]: I0130 18:58:40.843041 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" podStartSLOduration=2.402880264 podStartE2EDuration="2.842999714s" podCreationTimestamp="2026-01-30 18:58:38 +0000 UTC" firstStartedPulling="2026-01-30 18:58:39.820192296 +0000 UTC m=+1696.088570321" lastFinishedPulling="2026-01-30 18:58:40.260311706 +0000 UTC m=+1696.528689771" observedRunningTime="2026-01-30 18:58:40.838482502 +0000 UTC m=+1697.106860527" watchObservedRunningTime="2026-01-30 18:58:40.842999714 +0000 UTC m=+1697.111377749" Jan 30 18:58:44 crc kubenswrapper[4782]: I0130 18:58:44.423762 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:58:44 crc kubenswrapper[4782]: E0130 18:58:44.424371 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:58:58 crc kubenswrapper[4782]: I0130 18:58:58.411318 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:58:58 crc kubenswrapper[4782]: E0130 18:58:58.412259 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:59:11 crc kubenswrapper[4782]: I0130 18:59:11.411926 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:59:11 crc kubenswrapper[4782]: E0130 18:59:11.413798 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:59:22 crc kubenswrapper[4782]: I0130 18:59:22.411752 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:59:22 crc kubenswrapper[4782]: E0130 18:59:22.413060 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:59:32 crc kubenswrapper[4782]: I0130 18:59:32.072474 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-h46v5"] Jan 30 18:59:32 crc kubenswrapper[4782]: I0130 18:59:32.083620 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-h46v5"] Jan 30 18:59:32 crc kubenswrapper[4782]: I0130 18:59:32.423424 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17105ad5-4156-493d-95ee-e27a1d4e8622" path="/var/lib/kubelet/pods/17105ad5-4156-493d-95ee-e27a1d4e8622/volumes" Jan 30 18:59:34 crc kubenswrapper[4782]: I0130 18:59:34.418769 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:59:34 crc kubenswrapper[4782]: E0130 18:59:34.419426 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:59:39 crc kubenswrapper[4782]: I0130 18:59:39.043614 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-p9dwp"] Jan 30 18:59:39 crc kubenswrapper[4782]: I0130 18:59:39.058278 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-p9dwp"] Jan 30 18:59:40 crc kubenswrapper[4782]: I0130 18:59:40.422906 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0087b5d-ef94-4433-9e0e-23b509dd3003" path="/var/lib/kubelet/pods/c0087b5d-ef94-4433-9e0e-23b509dd3003/volumes" Jan 30 18:59:45 crc kubenswrapper[4782]: I0130 18:59:45.069023 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-f7fq6"] Jan 30 18:59:45 crc kubenswrapper[4782]: I0130 18:59:45.089360 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-f7fq6"] Jan 30 18:59:46 crc kubenswrapper[4782]: I0130 18:59:46.036019 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-q8pml"] Jan 30 18:59:46 crc kubenswrapper[4782]: I0130 18:59:46.046671 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-q8pml"] Jan 30 18:59:46 crc kubenswrapper[4782]: I0130 18:59:46.429921 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47aa0756-718b-4d1c-bef5-318895ee6c90" path="/var/lib/kubelet/pods/47aa0756-718b-4d1c-bef5-318895ee6c90/volumes" Jan 30 18:59:46 crc kubenswrapper[4782]: I0130 18:59:46.432058 4782 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ffaa09-371e-4549-a56f-11d6734ff40e" path="/var/lib/kubelet/pods/63ffaa09-371e-4549-a56f-11d6734ff40e/volumes" Jan 30 18:59:47 crc kubenswrapper[4782]: I0130 18:59:47.049463 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mpvf8"] Jan 30 18:59:47 crc kubenswrapper[4782]: I0130 18:59:47.070627 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mpvf8"] Jan 30 18:59:47 crc kubenswrapper[4782]: I0130 18:59:47.614315 4782 generic.go:334] "Generic (PLEG): container finished" podID="c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566" containerID="354a83c3f533cac069d729cd6bc2f42312811d003e1836b4e271880e4bc87f41" exitCode=0 Jan 30 18:59:47 crc kubenswrapper[4782]: I0130 18:59:47.614412 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" event={"ID":"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566","Type":"ContainerDied","Data":"354a83c3f533cac069d729cd6bc2f42312811d003e1836b4e271880e4bc87f41"} Jan 30 18:59:48 crc kubenswrapper[4782]: I0130 18:59:48.422590 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ddf9ab-a21e-4d41-b795-c0c926e38a1e" path="/var/lib/kubelet/pods/a9ddf9ab-a21e-4d41-b795-c0c926e38a1e/volumes" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.189311 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.309691 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8wth\" (UniqueName: \"kubernetes.io/projected/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-kube-api-access-d8wth\") pod \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.309929 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-inventory\") pod \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.310028 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-ssh-key-openstack-edpm-ipam\") pod \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\" (UID: \"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566\") " Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.316324 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-kube-api-access-d8wth" (OuterVolumeSpecName: "kube-api-access-d8wth") pod "c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566" (UID: "c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566"). InnerVolumeSpecName "kube-api-access-d8wth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.360858 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-inventory" (OuterVolumeSpecName: "inventory") pod "c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566" (UID: "c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.361303 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566" (UID: "c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.411361 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 18:59:49 crc kubenswrapper[4782]: E0130 18:59:49.411882 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.412717 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.412761 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.412777 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8wth\" (UniqueName: \"kubernetes.io/projected/c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566-kube-api-access-d8wth\") on node \"crc\" DevicePath \"\"" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.644511 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" event={"ID":"c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566","Type":"ContainerDied","Data":"7d2e04b87c3bec6555074e55676844a80216bfc6a3ab2d721fcbe35a0b949bf1"} Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.644581 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d2e04b87c3bec6555074e55676844a80216bfc6a3ab2d721fcbe35a0b949bf1" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.644609 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.750698 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq"] Jan 30 18:59:49 crc kubenswrapper[4782]: E0130 18:59:49.751457 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.751506 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.752007 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.753144 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.761036 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.761566 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.764580 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.764794 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq"] Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.764854 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.926864 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk762\" (UniqueName: \"kubernetes.io/projected/df904ca8-14f8-4f01-b67e-be59a86d4981-kube-api-access-dk762\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.927433 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:49 crc kubenswrapper[4782]: I0130 18:59:49.927742 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.030037 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.030784 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.030969 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk762\" (UniqueName: \"kubernetes.io/projected/df904ca8-14f8-4f01-b67e-be59a86d4981-kube-api-access-dk762\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.034722 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.035472 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.041720 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-bpwp5"] Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.059087 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-bpwp5"] Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.066864 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk762\" (UniqueName: \"kubernetes.io/projected/df904ca8-14f8-4f01-b67e-be59a86d4981-kube-api-access-dk762\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-klkrq\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.087369 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.427668 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07adaf47-0b0c-46f9-bf42-fc02ffec87a4" path="/var/lib/kubelet/pods/07adaf47-0b0c-46f9-bf42-fc02ffec87a4/volumes" Jan 30 18:59:50 crc kubenswrapper[4782]: I0130 18:59:50.665241 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq"] Jan 30 18:59:51 crc kubenswrapper[4782]: I0130 18:59:51.673665 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" event={"ID":"df904ca8-14f8-4f01-b67e-be59a86d4981","Type":"ContainerStarted","Data":"5a38de04c382ffb900f3f40d0e1c5f6f96e3fcd8e2288aa9303772af4e34587b"} Jan 30 18:59:51 crc kubenswrapper[4782]: I0130 18:59:51.674120 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" event={"ID":"df904ca8-14f8-4f01-b67e-be59a86d4981","Type":"ContainerStarted","Data":"0abbedf5c4b8f6593258546233d26a7e3e7cc05de6a54c21761d87647a2998a2"} Jan 30 18:59:51 crc kubenswrapper[4782]: I0130 18:59:51.706305 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" podStartSLOduration=2.267317204 podStartE2EDuration="2.706203533s" podCreationTimestamp="2026-01-30 18:59:49 +0000 UTC" firstStartedPulling="2026-01-30 18:59:50.665439361 +0000 UTC m=+1766.933817386" lastFinishedPulling="2026-01-30 18:59:51.10432565 +0000 UTC m=+1767.372703715" observedRunningTime="2026-01-30 18:59:51.699105407 +0000 UTC m=+1767.967483472" watchObservedRunningTime="2026-01-30 18:59:51.706203533 +0000 UTC m=+1767.974581588" Jan 30 18:59:56 crc kubenswrapper[4782]: I0130 18:59:56.737630 4782 generic.go:334] "Generic (PLEG): container finished" podID="df904ca8-14f8-4f01-b67e-be59a86d4981" containerID="5a38de04c382ffb900f3f40d0e1c5f6f96e3fcd8e2288aa9303772af4e34587b" exitCode=0 Jan 30 18:59:56 crc kubenswrapper[4782]: I0130 18:59:56.737761 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" event={"ID":"df904ca8-14f8-4f01-b67e-be59a86d4981","Type":"ContainerDied","Data":"5a38de04c382ffb900f3f40d0e1c5f6f96e3fcd8e2288aa9303772af4e34587b"} Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.265950 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.432763 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-inventory\") pod \"df904ca8-14f8-4f01-b67e-be59a86d4981\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.437542 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-ssh-key-openstack-edpm-ipam\") pod \"df904ca8-14f8-4f01-b67e-be59a86d4981\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.438469 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk762\" (UniqueName: \"kubernetes.io/projected/df904ca8-14f8-4f01-b67e-be59a86d4981-kube-api-access-dk762\") pod \"df904ca8-14f8-4f01-b67e-be59a86d4981\" (UID: \"df904ca8-14f8-4f01-b67e-be59a86d4981\") " Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.445222 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df904ca8-14f8-4f01-b67e-be59a86d4981-kube-api-access-dk762" (OuterVolumeSpecName: "kube-api-access-dk762") pod "df904ca8-14f8-4f01-b67e-be59a86d4981" (UID: "df904ca8-14f8-4f01-b67e-be59a86d4981"). InnerVolumeSpecName "kube-api-access-dk762". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.469548 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-inventory" (OuterVolumeSpecName: "inventory") pod "df904ca8-14f8-4f01-b67e-be59a86d4981" (UID: "df904ca8-14f8-4f01-b67e-be59a86d4981"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.481353 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "df904ca8-14f8-4f01-b67e-be59a86d4981" (UID: "df904ca8-14f8-4f01-b67e-be59a86d4981"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.543220 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.543288 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df904ca8-14f8-4f01-b67e-be59a86d4981-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.543304 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk762\" (UniqueName: \"kubernetes.io/projected/df904ca8-14f8-4f01-b67e-be59a86d4981-kube-api-access-dk762\") on node \"crc\" DevicePath \"\"" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.758667 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" event={"ID":"df904ca8-14f8-4f01-b67e-be59a86d4981","Type":"ContainerDied","Data":"0abbedf5c4b8f6593258546233d26a7e3e7cc05de6a54c21761d87647a2998a2"} Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.758742 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abbedf5c4b8f6593258546233d26a7e3e7cc05de6a54c21761d87647a2998a2" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.758857 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-klkrq" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.868603 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj"] Jan 30 18:59:58 crc kubenswrapper[4782]: E0130 18:59:58.869128 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df904ca8-14f8-4f01-b67e-be59a86d4981" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.869150 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="df904ca8-14f8-4f01-b67e-be59a86d4981" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.869503 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="df904ca8-14f8-4f01-b67e-be59a86d4981" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.870567 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.880748 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj"] Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.881531 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.882213 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.882604 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.882845 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.952645 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.952964 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:58 crc kubenswrapper[4782]: I0130 18:59:58.953115 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4xz6\" (UniqueName: \"kubernetes.io/projected/5b8169ab-3daf-43a7-a107-075317085df1-kube-api-access-t4xz6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.055899 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.056019 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4xz6\" (UniqueName: \"kubernetes.io/projected/5b8169ab-3daf-43a7-a107-075317085df1-kube-api-access-t4xz6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.056283 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-ssh-key-openstack-edpm-ipam\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.066786 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.069359 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.090027 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4xz6\" (UniqueName: \"kubernetes.io/projected/5b8169ab-3daf-43a7-a107-075317085df1-kube-api-access-t4xz6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-z68rj\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.198824 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 18:59:59 crc kubenswrapper[4782]: I0130 18:59:59.822969 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj"] Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.166007 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b"] Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.169191 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.173771 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.173999 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.179411 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b"] Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.305204 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2jv\" (UniqueName: \"kubernetes.io/projected/ad8992c9-9e40-46e3-9c11-d70d863f01c8-kube-api-access-9k2jv\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.305491 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad8992c9-9e40-46e3-9c11-d70d863f01c8-secret-volume\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.305545 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad8992c9-9e40-46e3-9c11-d70d863f01c8-config-volume\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.407671 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad8992c9-9e40-46e3-9c11-d70d863f01c8-secret-volume\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.407750 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad8992c9-9e40-46e3-9c11-d70d863f01c8-config-volume\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.408034 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k2jv\" (UniqueName: \"kubernetes.io/projected/ad8992c9-9e40-46e3-9c11-d70d863f01c8-kube-api-access-9k2jv\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.409008 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad8992c9-9e40-46e3-9c11-d70d863f01c8-config-volume\") pod 
\"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.412810 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad8992c9-9e40-46e3-9c11-d70d863f01c8-secret-volume\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.447277 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2jv\" (UniqueName: \"kubernetes.io/projected/ad8992c9-9e40-46e3-9c11-d70d863f01c8-kube-api-access-9k2jv\") pod \"collect-profiles-29496660-7kl9b\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.506471 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.782212 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" event={"ID":"5b8169ab-3daf-43a7-a107-075317085df1","Type":"ContainerStarted","Data":"7cff08b7b81d8b23f81b896c568eea0c5c0ac26ce6ef8c355fb56005362010f4"} Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.782539 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" event={"ID":"5b8169ab-3daf-43a7-a107-075317085df1","Type":"ContainerStarted","Data":"2738fd92a8c4a8feb1d1742af4f53a22d411c8564b76655fe432d1ad90e79588"} Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.815110 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" podStartSLOduration=2.374781224 podStartE2EDuration="2.815089369s" podCreationTimestamp="2026-01-30 18:59:58 +0000 UTC" firstStartedPulling="2026-01-30 18:59:59.843297063 +0000 UTC m=+1776.111675088" lastFinishedPulling="2026-01-30 19:00:00.283605198 +0000 UTC m=+1776.551983233" observedRunningTime="2026-01-30 19:00:00.798221952 +0000 UTC m=+1777.066600007" watchObservedRunningTime="2026-01-30 19:00:00.815089369 +0000 UTC m=+1777.083467394" Jan 30 19:00:00 crc kubenswrapper[4782]: W0130 19:00:00.973642 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad8992c9_9e40_46e3_9c11_d70d863f01c8.slice/crio-9f6b994e010f8d3064951bfff32d1aea2002b71e37b72c998f2cc73cf99a6df0 WatchSource:0}: Error finding container 9f6b994e010f8d3064951bfff32d1aea2002b71e37b72c998f2cc73cf99a6df0: Status 404 returned error can't find the container with id 9f6b994e010f8d3064951bfff32d1aea2002b71e37b72c998f2cc73cf99a6df0 Jan 30 19:00:00 crc kubenswrapper[4782]: I0130 19:00:00.977345 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b"] Jan 30 19:00:01 crc kubenswrapper[4782]: I0130 19:00:01.793970 4782 generic.go:334] "Generic (PLEG): container finished" podID="ad8992c9-9e40-46e3-9c11-d70d863f01c8" containerID="a19ad96b7b4446d607ed9cb46070d676a916204bbc339abdd0da171ff7b673fb" exitCode=0 Jan 30 19:00:01 crc 
kubenswrapper[4782]: I0130 19:00:01.794070 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" event={"ID":"ad8992c9-9e40-46e3-9c11-d70d863f01c8","Type":"ContainerDied","Data":"a19ad96b7b4446d607ed9cb46070d676a916204bbc339abdd0da171ff7b673fb"} Jan 30 19:00:01 crc kubenswrapper[4782]: I0130 19:00:01.794509 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" event={"ID":"ad8992c9-9e40-46e3-9c11-d70d863f01c8","Type":"ContainerStarted","Data":"9f6b994e010f8d3064951bfff32d1aea2002b71e37b72c998f2cc73cf99a6df0"} Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.176718 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.276318 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad8992c9-9e40-46e3-9c11-d70d863f01c8-secret-volume\") pod \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.276621 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad8992c9-9e40-46e3-9c11-d70d863f01c8-config-volume\") pod \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.276703 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k2jv\" (UniqueName: \"kubernetes.io/projected/ad8992c9-9e40-46e3-9c11-d70d863f01c8-kube-api-access-9k2jv\") pod \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\" (UID: \"ad8992c9-9e40-46e3-9c11-d70d863f01c8\") " Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.277562 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad8992c9-9e40-46e3-9c11-d70d863f01c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad8992c9-9e40-46e3-9c11-d70d863f01c8" (UID: "ad8992c9-9e40-46e3-9c11-d70d863f01c8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.285619 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8992c9-9e40-46e3-9c11-d70d863f01c8-kube-api-access-9k2jv" (OuterVolumeSpecName: "kube-api-access-9k2jv") pod "ad8992c9-9e40-46e3-9c11-d70d863f01c8" (UID: "ad8992c9-9e40-46e3-9c11-d70d863f01c8"). InnerVolumeSpecName "kube-api-access-9k2jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.287495 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad8992c9-9e40-46e3-9c11-d70d863f01c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ad8992c9-9e40-46e3-9c11-d70d863f01c8" (UID: "ad8992c9-9e40-46e3-9c11-d70d863f01c8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.379313 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad8992c9-9e40-46e3-9c11-d70d863f01c8-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.379360 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad8992c9-9e40-46e3-9c11-d70d863f01c8-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.379379 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k2jv\" (UniqueName: \"kubernetes.io/projected/ad8992c9-9e40-46e3-9c11-d70d863f01c8-kube-api-access-9k2jv\") on node \"crc\" DevicePath \"\"" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.820370 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" event={"ID":"ad8992c9-9e40-46e3-9c11-d70d863f01c8","Type":"ContainerDied","Data":"9f6b994e010f8d3064951bfff32d1aea2002b71e37b72c998f2cc73cf99a6df0"} Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.820417 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6b994e010f8d3064951bfff32d1aea2002b71e37b72c998f2cc73cf99a6df0" Jan 30 19:00:03 crc kubenswrapper[4782]: I0130 19:00:03.820479 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b" Jan 30 19:00:04 crc kubenswrapper[4782]: I0130 19:00:04.416494 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:00:04 crc kubenswrapper[4782]: E0130 19:00:04.416847 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:00:18 crc kubenswrapper[4782]: I0130 19:00:18.411604 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:00:18 crc kubenswrapper[4782]: E0130 19:00:18.412599 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:00:23 crc kubenswrapper[4782]: I0130 19:00:23.876077 4782 scope.go:117] "RemoveContainer" containerID="3a215ed02134533ced39cc9b6b783ff46122b85767c17dfaf3a6c8155209cfa7" Jan 30 19:00:23 crc kubenswrapper[4782]: I0130 19:00:23.951502 4782 scope.go:117] "RemoveContainer" containerID="ac7b4bd8674c634ebaaebc3c3cacd267e39e6541b520fc04a9b6032720f3ce1b" Jan 30 19:00:24 crc kubenswrapper[4782]: I0130 19:00:24.031492 4782 scope.go:117] "RemoveContainer" containerID="e065c5928b93422455efb54aab9b90775c8fff25f56b7b96a48380a896d91931" Jan 30 19:00:24 crc 
kubenswrapper[4782]: I0130 19:00:24.314339 4782 scope.go:117] "RemoveContainer" containerID="961c540b3814e2a30632333589d5245e623a044016634d43078e0baed9bcfd9a" Jan 30 19:00:24 crc kubenswrapper[4782]: I0130 19:00:24.356714 4782 scope.go:117] "RemoveContainer" containerID="3cb806c040408344fb39ee4e0454be212dc4c5bafa67da379f195fafca08c84b" Jan 30 19:00:24 crc kubenswrapper[4782]: I0130 19:00:24.445943 4782 scope.go:117] "RemoveContainer" containerID="e0d6b134827799f724f3cf412addb6b1b4a4ef4863d14aad089a1325736c2ff4" Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.055411 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-csd5t"] Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.066287 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2f36-account-create-update-5x2wl"] Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.078203 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mths6"] Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.091310 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-csd5t"] Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.099061 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2f36-account-create-update-5x2wl"] Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.137048 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mths6"] Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.422050 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04bb697c-568d-47bc-abb6-56dc09be923d" path="/var/lib/kubelet/pods/04bb697c-568d-47bc-abb6-56dc09be923d/volumes" Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.423130 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5e6d06-a8c5-4789-8e21-6aba18cb8088" path="/var/lib/kubelet/pods/0d5e6d06-a8c5-4789-8e21-6aba18cb8088/volumes" Jan 30 19:00:26 crc kubenswrapper[4782]: I0130 19:00:26.423868 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233df4b9-7dfd-4817-b0fc-51db7b8d77d1" path="/var/lib/kubelet/pods/233df4b9-7dfd-4817-b0fc-51db7b8d77d1/volumes" Jan 30 19:00:27 crc kubenswrapper[4782]: I0130 19:00:27.038273 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4k56s"] Jan 30 19:00:27 crc kubenswrapper[4782]: I0130 19:00:27.056587 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4a3d-account-create-update-h2xvb"] Jan 30 19:00:27 crc kubenswrapper[4782]: I0130 19:00:27.066714 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4k56s"] Jan 30 19:00:27 crc kubenswrapper[4782]: I0130 19:00:27.078043 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4a3d-account-create-update-h2xvb"] Jan 30 19:00:27 crc kubenswrapper[4782]: I0130 19:00:27.085433 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4ad3-account-create-update-t6n2b"] Jan 30 19:00:27 crc kubenswrapper[4782]: I0130 19:00:27.092180 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4ad3-account-create-update-t6n2b"] Jan 30 19:00:28 crc kubenswrapper[4782]: I0130 19:00:28.433666 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d517ad-1f63-4dc7-9893-b686884dc3d8" path="/var/lib/kubelet/pods/24d517ad-1f63-4dc7-9893-b686884dc3d8/volumes" 
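The E-level records above show pod_workers.go declining to restart machine-config-daemon while its CrashLoopBackOff back-off (capped at 5m0s) is still running, interleaved with routine RemoveContainer garbage collection, SyncLoop DELETE/REMOVE events, and orphaned-volume cleanup. Each record carries the journald prefix ("Jan 30 ... crc kubenswrapper[4782]:") followed by a klog-style header: severity letter, MMDD date, wall-clock time, PID, and source file:line. The following is a minimal Python sketch of one way to tally those back-off errors per pod; it assumes one journal record per line, as journalctl -u kubelet emits them, and its regexes and function names are illustrative rather than anything taken from kubelet itself.

import re
import sys
from collections import Counter

# Journald prefix followed by a klog-style header, as seen in the records
# above. Illustrative only; not kubelet's own definition of its log format.
KLOG_LINE = re.compile(
    r'(?P<month>\w{3}) +(?P<day>\d+) (?P<time>[\d:]+) (?P<host>\S+) '
    r'kubenswrapper\[(?P<unit_pid>\d+)\]: '
    r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<klog_time>[\d:.]+) +(?P<pid>\d+) '
    r'(?P<src>[\w./-]+:\d+)\] (?P<msg>.*)'
)

# Structured messages carry the pod name as pod="namespace/name".
POD_FIELD = re.compile(r'pod="(?P<pod>[^"]+)"')

def crashloop_counts(lines):
    """Count E-level 'Error syncing pod' records mentioning CrashLoopBackOff, per pod."""
    counts = Counter()
    for line in lines:
        m = KLOG_LINE.match(line)
        if not m or m.group("sev") != "E":
            continue
        if "CrashLoopBackOff" not in m.group("msg"):
            continue
        pod = POD_FIELD.search(m.group("msg"))
        if pod:
            counts[pod.group("pod")] += 1
    return counts

if __name__ == "__main__":
    # Example: journalctl -u kubelet --no-pager | python3 crashloop_tally.py
    for pod, n in crashloop_counts(sys.stdin).most_common():
        print(f"{n:4d}  {pod}")

Run over the kubelet journal, a tally like this surfaces which pods are spending the most time in back-off, such as openshift-machine-config-operator/machine-config-daemon-p7zdh in this section.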
Jan 30 19:00:28 crc kubenswrapper[4782]: I0130 19:00:28.435423 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bdb65b3-6289-42fb-8b67-290b3b72cb4f" path="/var/lib/kubelet/pods/6bdb65b3-6289-42fb-8b67-290b3b72cb4f/volumes" Jan 30 19:00:28 crc kubenswrapper[4782]: I0130 19:00:28.436577 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0185b2b-ed05-407b-93ee-3f5e83ee630a" path="/var/lib/kubelet/pods/d0185b2b-ed05-407b-93ee-3f5e83ee630a/volumes" Jan 30 19:00:31 crc kubenswrapper[4782]: I0130 19:00:31.412884 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:00:31 crc kubenswrapper[4782]: E0130 19:00:31.413497 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:00:36 crc kubenswrapper[4782]: I0130 19:00:36.205454 4782 generic.go:334] "Generic (PLEG): container finished" podID="5b8169ab-3daf-43a7-a107-075317085df1" containerID="7cff08b7b81d8b23f81b896c568eea0c5c0ac26ce6ef8c355fb56005362010f4" exitCode=0 Jan 30 19:00:36 crc kubenswrapper[4782]: I0130 19:00:36.205555 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" event={"ID":"5b8169ab-3daf-43a7-a107-075317085df1","Type":"ContainerDied","Data":"7cff08b7b81d8b23f81b896c568eea0c5c0ac26ce6ef8c355fb56005362010f4"} Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.785943 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.866627 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4xz6\" (UniqueName: \"kubernetes.io/projected/5b8169ab-3daf-43a7-a107-075317085df1-kube-api-access-t4xz6\") pod \"5b8169ab-3daf-43a7-a107-075317085df1\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.866681 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-inventory\") pod \"5b8169ab-3daf-43a7-a107-075317085df1\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.866714 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-ssh-key-openstack-edpm-ipam\") pod \"5b8169ab-3daf-43a7-a107-075317085df1\" (UID: \"5b8169ab-3daf-43a7-a107-075317085df1\") " Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.875338 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8169ab-3daf-43a7-a107-075317085df1-kube-api-access-t4xz6" (OuterVolumeSpecName: "kube-api-access-t4xz6") pod "5b8169ab-3daf-43a7-a107-075317085df1" (UID: "5b8169ab-3daf-43a7-a107-075317085df1"). InnerVolumeSpecName "kube-api-access-t4xz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.897834 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5b8169ab-3daf-43a7-a107-075317085df1" (UID: "5b8169ab-3daf-43a7-a107-075317085df1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.905598 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-inventory" (OuterVolumeSpecName: "inventory") pod "5b8169ab-3daf-43a7-a107-075317085df1" (UID: "5b8169ab-3daf-43a7-a107-075317085df1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.969325 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4xz6\" (UniqueName: \"kubernetes.io/projected/5b8169ab-3daf-43a7-a107-075317085df1-kube-api-access-t4xz6\") on node \"crc\" DevicePath \"\"" Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.969366 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:00:37 crc kubenswrapper[4782]: I0130 19:00:37.969380 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b8169ab-3daf-43a7-a107-075317085df1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.230613 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" event={"ID":"5b8169ab-3daf-43a7-a107-075317085df1","Type":"ContainerDied","Data":"2738fd92a8c4a8feb1d1742af4f53a22d411c8564b76655fe432d1ad90e79588"} Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.230930 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2738fd92a8c4a8feb1d1742af4f53a22d411c8564b76655fe432d1ad90e79588" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.230697 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-z68rj" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.347638 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk"] Jan 30 19:00:38 crc kubenswrapper[4782]: E0130 19:00:38.348268 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8169ab-3daf-43a7-a107-075317085df1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.348299 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8169ab-3daf-43a7-a107-075317085df1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:00:38 crc kubenswrapper[4782]: E0130 19:00:38.348349 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8992c9-9e40-46e3-9c11-d70d863f01c8" containerName="collect-profiles" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.348363 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8992c9-9e40-46e3-9c11-d70d863f01c8" containerName="collect-profiles" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.348667 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8169ab-3daf-43a7-a107-075317085df1" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.348707 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8992c9-9e40-46e3-9c11-d70d863f01c8" containerName="collect-profiles" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.349831 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.355273 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.355335 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.355396 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.355461 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.369271 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk"] Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.478242 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.478733 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qml7k\" (UniqueName: \"kubernetes.io/projected/f9396100-4e8e-4e30-af8c-82043b59d08d-kube-api-access-qml7k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: 
\"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.478856 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.580765 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.580845 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qml7k\" (UniqueName: \"kubernetes.io/projected/f9396100-4e8e-4e30-af8c-82043b59d08d-kube-api-access-qml7k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.581696 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.585754 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.592756 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.611085 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qml7k\" (UniqueName: \"kubernetes.io/projected/f9396100-4e8e-4e30-af8c-82043b59d08d-kube-api-access-qml7k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:38 crc kubenswrapper[4782]: I0130 19:00:38.679218 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:00:39 crc kubenswrapper[4782]: I0130 19:00:39.298628 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk"] Jan 30 19:00:39 crc kubenswrapper[4782]: W0130 19:00:39.299256 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9396100_4e8e_4e30_af8c_82043b59d08d.slice/crio-9790f85664d1e4acd3dc5a025026c7ab541e8139084ea8966d5633b6daa2d9e9 WatchSource:0}: Error finding container 9790f85664d1e4acd3dc5a025026c7ab541e8139084ea8966d5633b6daa2d9e9: Status 404 returned error can't find the container with id 9790f85664d1e4acd3dc5a025026c7ab541e8139084ea8966d5633b6daa2d9e9 Jan 30 19:00:40 crc kubenswrapper[4782]: I0130 19:00:40.261412 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" event={"ID":"f9396100-4e8e-4e30-af8c-82043b59d08d","Type":"ContainerStarted","Data":"23d18b8866b8e5e39fe77f13394e83dccf9dbca55ecfafc1078f0ba548809479"} Jan 30 19:00:40 crc kubenswrapper[4782]: I0130 19:00:40.261741 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" event={"ID":"f9396100-4e8e-4e30-af8c-82043b59d08d","Type":"ContainerStarted","Data":"9790f85664d1e4acd3dc5a025026c7ab541e8139084ea8966d5633b6daa2d9e9"} Jan 30 19:00:40 crc kubenswrapper[4782]: I0130 19:00:40.294346 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" podStartSLOduration=1.87828157 podStartE2EDuration="2.294324255s" podCreationTimestamp="2026-01-30 19:00:38 +0000 UTC" firstStartedPulling="2026-01-30 19:00:39.303314273 +0000 UTC m=+1815.571692298" lastFinishedPulling="2026-01-30 19:00:39.719356918 +0000 UTC m=+1815.987734983" observedRunningTime="2026-01-30 19:00:40.288973322 +0000 UTC m=+1816.557351357" watchObservedRunningTime="2026-01-30 19:00:40.294324255 +0000 UTC m=+1816.562702290" Jan 30 19:00:43 crc kubenswrapper[4782]: I0130 19:00:43.411189 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:00:43 crc kubenswrapper[4782]: E0130 19:00:43.412309 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:00:54 crc kubenswrapper[4782]: I0130 19:00:54.418576 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:00:54 crc kubenswrapper[4782]: E0130 19:00:54.419565 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:00:58 crc kubenswrapper[4782]: I0130 
19:00:58.072880 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsq8m"] Jan 30 19:00:58 crc kubenswrapper[4782]: I0130 19:00:58.079708 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wsq8m"] Jan 30 19:00:58 crc kubenswrapper[4782]: I0130 19:00:58.439491 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="616ca793-5768-474b-b80c-c29026c68bd6" path="/var/lib/kubelet/pods/616ca793-5768-474b-b80c-c29026c68bd6/volumes" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.148621 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29496661-m5ftp"] Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.150536 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.155766 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-combined-ca-bundle\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.155921 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tplmr\" (UniqueName: \"kubernetes.io/projected/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-kube-api-access-tplmr\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.156316 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-fernet-keys\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.156443 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-config-data\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.178138 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496661-m5ftp"] Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.258413 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tplmr\" (UniqueName: \"kubernetes.io/projected/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-kube-api-access-tplmr\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.258783 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-fernet-keys\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.258860 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-config-data\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.258926 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-combined-ca-bundle\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.264885 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-fernet-keys\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.265525 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-config-data\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.267853 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-combined-ca-bundle\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.286482 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tplmr\" (UniqueName: \"kubernetes.io/projected/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-kube-api-access-tplmr\") pod \"keystone-cron-29496661-m5ftp\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:00 crc kubenswrapper[4782]: I0130 19:01:00.521971 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:01 crc kubenswrapper[4782]: I0130 19:01:01.047719 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496661-m5ftp"] Jan 30 19:01:01 crc kubenswrapper[4782]: I0130 19:01:01.522222 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496661-m5ftp" event={"ID":"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2","Type":"ContainerStarted","Data":"2aac39d5b32c530afd870e96901382c14a82e522d3eeb3dba54cb4bcff596801"} Jan 30 19:01:01 crc kubenswrapper[4782]: I0130 19:01:01.522736 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496661-m5ftp" event={"ID":"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2","Type":"ContainerStarted","Data":"c6299172c323128967dc4f6d145dbf38313dde59ab3657272c76acea34d13b23"} Jan 30 19:01:01 crc kubenswrapper[4782]: I0130 19:01:01.552525 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29496661-m5ftp" podStartSLOduration=1.5524900769999999 podStartE2EDuration="1.552490077s" podCreationTimestamp="2026-01-30 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 19:01:01.539169797 +0000 UTC m=+1837.807547852" watchObservedRunningTime="2026-01-30 19:01:01.552490077 +0000 UTC m=+1837.820868152" Jan 30 19:01:04 crc kubenswrapper[4782]: I0130 19:01:04.558507 4782 generic.go:334] "Generic (PLEG): container finished" podID="dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" containerID="2aac39d5b32c530afd870e96901382c14a82e522d3eeb3dba54cb4bcff596801" exitCode=0 Jan 30 19:01:04 crc kubenswrapper[4782]: I0130 19:01:04.558666 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496661-m5ftp" event={"ID":"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2","Type":"ContainerDied","Data":"2aac39d5b32c530afd870e96901382c14a82e522d3eeb3dba54cb4bcff596801"} Jan 30 19:01:05 crc kubenswrapper[4782]: I0130 19:01:05.944111 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:05 crc kubenswrapper[4782]: I0130 19:01:05.996970 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-fernet-keys\") pod \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " Jan 30 19:01:05 crc kubenswrapper[4782]: I0130 19:01:05.997734 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-config-data\") pod \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " Jan 30 19:01:05 crc kubenswrapper[4782]: I0130 19:01:05.997913 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tplmr\" (UniqueName: \"kubernetes.io/projected/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-kube-api-access-tplmr\") pod \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " Jan 30 19:01:05 crc kubenswrapper[4782]: I0130 19:01:05.998115 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-combined-ca-bundle\") pod \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\" (UID: \"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2\") " Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.004013 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" (UID: "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.005505 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-kube-api-access-tplmr" (OuterVolumeSpecName: "kube-api-access-tplmr") pod "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" (UID: "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2"). InnerVolumeSpecName "kube-api-access-tplmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.028954 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" (UID: "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.046581 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-config-data" (OuterVolumeSpecName: "config-data") pod "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" (UID: "dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.101562 4782 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.101606 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tplmr\" (UniqueName: \"kubernetes.io/projected/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-kube-api-access-tplmr\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.101621 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.101633 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.587945 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496661-m5ftp" event={"ID":"dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2","Type":"ContainerDied","Data":"c6299172c323128967dc4f6d145dbf38313dde59ab3657272c76acea34d13b23"} Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.588444 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6299172c323128967dc4f6d145dbf38313dde59ab3657272c76acea34d13b23" Jan 30 19:01:06 crc kubenswrapper[4782]: I0130 19:01:06.588068 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496661-m5ftp" Jan 30 19:01:09 crc kubenswrapper[4782]: I0130 19:01:09.410987 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:01:09 crc kubenswrapper[4782]: E0130 19:01:09.411453 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:01:23 crc kubenswrapper[4782]: I0130 19:01:23.541901 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:01:23 crc kubenswrapper[4782]: E0130 19:01:23.542851 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:01:24 crc kubenswrapper[4782]: I0130 19:01:24.604719 4782 scope.go:117] "RemoveContainer" containerID="29010e56782893fa617a8e2c0cd0c5a188f249db8d97bd95cd27e576509b2873" Jan 30 19:01:24 crc kubenswrapper[4782]: I0130 19:01:24.641652 4782 scope.go:117] "RemoveContainer" containerID="19a09d398e1dd3e57b06678a11dd97bbfe9cdf78a150eea9ea7b8d0b3a907745" Jan 30 
19:01:24 crc kubenswrapper[4782]: I0130 19:01:24.707739 4782 scope.go:117] "RemoveContainer" containerID="80d8d94362d353c3ee5e16fb0223a49de80f56dcdac5824fda2ec39c90cb2bd3" Jan 30 19:01:24 crc kubenswrapper[4782]: I0130 19:01:24.770541 4782 scope.go:117] "RemoveContainer" containerID="7a45e10cc336b44091367d42fa0e6c048a33f1597edc1ca8f69759f18364baa5" Jan 30 19:01:24 crc kubenswrapper[4782]: I0130 19:01:24.847478 4782 scope.go:117] "RemoveContainer" containerID="8c2546defb9d2bc5948dce22e653318e7e09cdfcfdad5b9843d942651fe66f78" Jan 30 19:01:24 crc kubenswrapper[4782]: I0130 19:01:24.879682 4782 scope.go:117] "RemoveContainer" containerID="6b13dfb69af04f63ac75806c7c8b1e99a29a7bc41f24744cc29ba67a5daa914a" Jan 30 19:01:24 crc kubenswrapper[4782]: I0130 19:01:24.914414 4782 scope.go:117] "RemoveContainer" containerID="b6d3219b7f40a2655f11e2a18a6aedf81bf3a961c4d7ecf9036790423358269e" Jan 30 19:01:27 crc kubenswrapper[4782]: I0130 19:01:27.057642 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xjjp9"] Jan 30 19:01:27 crc kubenswrapper[4782]: I0130 19:01:27.068378 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xjjp9"] Jan 30 19:01:28 crc kubenswrapper[4782]: I0130 19:01:28.424824 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485cb541-e79c-474e-b68d-34e10ee57480" path="/var/lib/kubelet/pods/485cb541-e79c-474e-b68d-34e10ee57480/volumes" Jan 30 19:01:28 crc kubenswrapper[4782]: I0130 19:01:28.870053 4782 generic.go:334] "Generic (PLEG): container finished" podID="f9396100-4e8e-4e30-af8c-82043b59d08d" containerID="23d18b8866b8e5e39fe77f13394e83dccf9dbca55ecfafc1078f0ba548809479" exitCode=0 Jan 30 19:01:28 crc kubenswrapper[4782]: I0130 19:01:28.870118 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" event={"ID":"f9396100-4e8e-4e30-af8c-82043b59d08d","Type":"ContainerDied","Data":"23d18b8866b8e5e39fe77f13394e83dccf9dbca55ecfafc1078f0ba548809479"} Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.423103 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.502687 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qml7k\" (UniqueName: \"kubernetes.io/projected/f9396100-4e8e-4e30-af8c-82043b59d08d-kube-api-access-qml7k\") pod \"f9396100-4e8e-4e30-af8c-82043b59d08d\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.502816 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-ssh-key-openstack-edpm-ipam\") pod \"f9396100-4e8e-4e30-af8c-82043b59d08d\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.502861 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-inventory\") pod \"f9396100-4e8e-4e30-af8c-82043b59d08d\" (UID: \"f9396100-4e8e-4e30-af8c-82043b59d08d\") " Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.521538 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9396100-4e8e-4e30-af8c-82043b59d08d-kube-api-access-qml7k" (OuterVolumeSpecName: "kube-api-access-qml7k") pod "f9396100-4e8e-4e30-af8c-82043b59d08d" (UID: "f9396100-4e8e-4e30-af8c-82043b59d08d"). InnerVolumeSpecName "kube-api-access-qml7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.547818 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-inventory" (OuterVolumeSpecName: "inventory") pod "f9396100-4e8e-4e30-af8c-82043b59d08d" (UID: "f9396100-4e8e-4e30-af8c-82043b59d08d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.549644 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f9396100-4e8e-4e30-af8c-82043b59d08d" (UID: "f9396100-4e8e-4e30-af8c-82043b59d08d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.605267 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qml7k\" (UniqueName: \"kubernetes.io/projected/f9396100-4e8e-4e30-af8c-82043b59d08d-kube-api-access-qml7k\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.605730 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.605752 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9396100-4e8e-4e30-af8c-82043b59d08d-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.893733 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" event={"ID":"f9396100-4e8e-4e30-af8c-82043b59d08d","Type":"ContainerDied","Data":"9790f85664d1e4acd3dc5a025026c7ab541e8139084ea8966d5633b6daa2d9e9"} Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.893803 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9790f85664d1e4acd3dc5a025026c7ab541e8139084ea8966d5633b6daa2d9e9" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.894043 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.999136 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mvw5c"] Jan 30 19:01:30 crc kubenswrapper[4782]: E0130 19:01:30.999531 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" containerName="keystone-cron" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.999549 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" containerName="keystone-cron" Jan 30 19:01:30 crc kubenswrapper[4782]: E0130 19:01:30.999562 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9396100-4e8e-4e30-af8c-82043b59d08d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.999571 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9396100-4e8e-4e30-af8c-82043b59d08d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.999740 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9396100-4e8e-4e30-af8c-82043b59d08d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:01:30 crc kubenswrapper[4782]: I0130 19:01:30.999759 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2" containerName="keystone-cron" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.000431 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.003682 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.003892 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.003948 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.004094 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.019573 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mvw5c"] Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.077797 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8ht9p"] Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.088107 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8ht9p"] Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.115074 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.115221 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.115334 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r768t\" (UniqueName: \"kubernetes.io/projected/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-kube-api-access-r768t\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.216968 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.217075 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.217125 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r768t\" (UniqueName: \"kubernetes.io/projected/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-kube-api-access-r768t\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.224994 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.225455 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.240291 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r768t\" (UniqueName: \"kubernetes.io/projected/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-kube-api-access-r768t\") pod \"ssh-known-hosts-edpm-deployment-mvw5c\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.331011 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.849962 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mvw5c"] Jan 30 19:01:31 crc kubenswrapper[4782]: W0130 19:01:31.858599 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cceb6ea_381a_4862_bbff_42f7ce3cbaf4.slice/crio-ee8587b3ecb9d23a0db86f73151f303874785486b114aa09a21e6e92e20be26d WatchSource:0}: Error finding container ee8587b3ecb9d23a0db86f73151f303874785486b114aa09a21e6e92e20be26d: Status 404 returned error can't find the container with id ee8587b3ecb9d23a0db86f73151f303874785486b114aa09a21e6e92e20be26d Jan 30 19:01:31 crc kubenswrapper[4782]: I0130 19:01:31.901673 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" event={"ID":"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4","Type":"ContainerStarted","Data":"ee8587b3ecb9d23a0db86f73151f303874785486b114aa09a21e6e92e20be26d"} Jan 30 19:01:32 crc kubenswrapper[4782]: I0130 19:01:32.424324 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c429234-8185-4737-91f8-a65403cc83d2" path="/var/lib/kubelet/pods/7c429234-8185-4737-91f8-a65403cc83d2/volumes" Jan 30 19:01:32 crc kubenswrapper[4782]: I0130 19:01:32.914036 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" event={"ID":"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4","Type":"ContainerStarted","Data":"2637a62d5a57c0aad11ca3ea5094aab4bb10d2ab103a5bae5e09dbd60d1f36a7"} Jan 30 19:01:32 crc kubenswrapper[4782]: I0130 19:01:32.942253 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" 
podStartSLOduration=2.512767383 podStartE2EDuration="2.9422083s" podCreationTimestamp="2026-01-30 19:01:30 +0000 UTC" firstStartedPulling="2026-01-30 19:01:31.86112984 +0000 UTC m=+1868.129507865" lastFinishedPulling="2026-01-30 19:01:32.290570717 +0000 UTC m=+1868.558948782" observedRunningTime="2026-01-30 19:01:32.941839541 +0000 UTC m=+1869.210217606" watchObservedRunningTime="2026-01-30 19:01:32.9422083 +0000 UTC m=+1869.210586325" Jan 30 19:01:38 crc kubenswrapper[4782]: I0130 19:01:38.411413 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:01:38 crc kubenswrapper[4782]: E0130 19:01:38.412263 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:01:40 crc kubenswrapper[4782]: I0130 19:01:40.001112 4782 generic.go:334] "Generic (PLEG): container finished" podID="0cceb6ea-381a-4862-bbff-42f7ce3cbaf4" containerID="2637a62d5a57c0aad11ca3ea5094aab4bb10d2ab103a5bae5e09dbd60d1f36a7" exitCode=0 Jan 30 19:01:40 crc kubenswrapper[4782]: I0130 19:01:40.001539 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" event={"ID":"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4","Type":"ContainerDied","Data":"2637a62d5a57c0aad11ca3ea5094aab4bb10d2ab103a5bae5e09dbd60d1f36a7"} Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.431976 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.452981 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r768t\" (UniqueName: \"kubernetes.io/projected/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-kube-api-access-r768t\") pod \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.453050 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-inventory-0\") pod \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.453309 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-ssh-key-openstack-edpm-ipam\") pod \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\" (UID: \"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4\") " Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.459171 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-kube-api-access-r768t" (OuterVolumeSpecName: "kube-api-access-r768t") pod "0cceb6ea-381a-4862-bbff-42f7ce3cbaf4" (UID: "0cceb6ea-381a-4862-bbff-42f7ce3cbaf4"). InnerVolumeSpecName "kube-api-access-r768t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.487129 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "0cceb6ea-381a-4862-bbff-42f7ce3cbaf4" (UID: "0cceb6ea-381a-4862-bbff-42f7ce3cbaf4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.496485 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0cceb6ea-381a-4862-bbff-42f7ce3cbaf4" (UID: "0cceb6ea-381a-4862-bbff-42f7ce3cbaf4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.557399 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r768t\" (UniqueName: \"kubernetes.io/projected/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-kube-api-access-r768t\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.557476 4782 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:41 crc kubenswrapper[4782]: I0130 19:01:41.557501 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0cceb6ea-381a-4862-bbff-42f7ce3cbaf4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.024475 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" event={"ID":"0cceb6ea-381a-4862-bbff-42f7ce3cbaf4","Type":"ContainerDied","Data":"ee8587b3ecb9d23a0db86f73151f303874785486b114aa09a21e6e92e20be26d"} Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.024514 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee8587b3ecb9d23a0db86f73151f303874785486b114aa09a21e6e92e20be26d" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.024541 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mvw5c" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.128813 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m"] Jan 30 19:01:42 crc kubenswrapper[4782]: E0130 19:01:42.129306 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cceb6ea-381a-4862-bbff-42f7ce3cbaf4" containerName="ssh-known-hosts-edpm-deployment" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.129326 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cceb6ea-381a-4862-bbff-42f7ce3cbaf4" containerName="ssh-known-hosts-edpm-deployment" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.129619 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cceb6ea-381a-4862-bbff-42f7ce3cbaf4" containerName="ssh-known-hosts-edpm-deployment" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.130425 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.133819 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.134110 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.135604 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.137330 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.141477 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m"] Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.176666 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.176833 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkfw\" (UniqueName: \"kubernetes.io/projected/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-kube-api-access-6dkfw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.176877 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.279173 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.279874 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkfw\" (UniqueName: \"kubernetes.io/projected/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-kube-api-access-6dkfw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.279948 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-inventory\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.282767 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.283166 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.298412 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkfw\" (UniqueName: \"kubernetes.io/projected/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-kube-api-access-6dkfw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7st4m\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.449343 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:42 crc kubenswrapper[4782]: I0130 19:01:42.991945 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m"] Jan 30 19:01:43 crc kubenswrapper[4782]: I0130 19:01:43.002096 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:01:43 crc kubenswrapper[4782]: I0130 19:01:43.037703 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" event={"ID":"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3","Type":"ContainerStarted","Data":"118f65edfa8738fac6a252bad7dc85f8f42e49e6f829370f0be09020284bc5e3"} Jan 30 19:01:44 crc kubenswrapper[4782]: I0130 19:01:44.047936 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" event={"ID":"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3","Type":"ContainerStarted","Data":"28939f0714ad5612740711fbf87472651970b6884b4453106dbcc2bc3fc10d42"} Jan 30 19:01:44 crc kubenswrapper[4782]: I0130 19:01:44.070663 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" podStartSLOduration=1.6130973210000001 podStartE2EDuration="2.070646336s" podCreationTimestamp="2026-01-30 19:01:42 +0000 UTC" firstStartedPulling="2026-01-30 19:01:43.001859112 +0000 UTC m=+1879.270237147" lastFinishedPulling="2026-01-30 19:01:43.459408137 +0000 UTC m=+1879.727786162" observedRunningTime="2026-01-30 19:01:44.063336125 +0000 UTC m=+1880.331714150" watchObservedRunningTime="2026-01-30 19:01:44.070646336 +0000 UTC m=+1880.339024361" Jan 30 19:01:51 crc kubenswrapper[4782]: I0130 19:01:51.411817 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:01:51 crc kubenswrapper[4782]: 
E0130 19:01:51.412911 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:01:52 crc kubenswrapper[4782]: I0130 19:01:52.139414 4782 generic.go:334] "Generic (PLEG): container finished" podID="f287af23-a5f5-4aa9-b9c2-9cd87fc26da3" containerID="28939f0714ad5612740711fbf87472651970b6884b4453106dbcc2bc3fc10d42" exitCode=0 Jan 30 19:01:52 crc kubenswrapper[4782]: I0130 19:01:52.139490 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" event={"ID":"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3","Type":"ContainerDied","Data":"28939f0714ad5612740711fbf87472651970b6884b4453106dbcc2bc3fc10d42"} Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.729912 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.846550 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-ssh-key-openstack-edpm-ipam\") pod \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.846622 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-inventory\") pod \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.846702 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dkfw\" (UniqueName: \"kubernetes.io/projected/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-kube-api-access-6dkfw\") pod \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\" (UID: \"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3\") " Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.852502 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-kube-api-access-6dkfw" (OuterVolumeSpecName: "kube-api-access-6dkfw") pod "f287af23-a5f5-4aa9-b9c2-9cd87fc26da3" (UID: "f287af23-a5f5-4aa9-b9c2-9cd87fc26da3"). InnerVolumeSpecName "kube-api-access-6dkfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.874538 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f287af23-a5f5-4aa9-b9c2-9cd87fc26da3" (UID: "f287af23-a5f5-4aa9-b9c2-9cd87fc26da3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.877671 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-inventory" (OuterVolumeSpecName: "inventory") pod "f287af23-a5f5-4aa9-b9c2-9cd87fc26da3" (UID: "f287af23-a5f5-4aa9-b9c2-9cd87fc26da3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.948682 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.948724 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:53 crc kubenswrapper[4782]: I0130 19:01:53.948738 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dkfw\" (UniqueName: \"kubernetes.io/projected/f287af23-a5f5-4aa9-b9c2-9cd87fc26da3-kube-api-access-6dkfw\") on node \"crc\" DevicePath \"\"" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.163089 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" event={"ID":"f287af23-a5f5-4aa9-b9c2-9cd87fc26da3","Type":"ContainerDied","Data":"118f65edfa8738fac6a252bad7dc85f8f42e49e6f829370f0be09020284bc5e3"} Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.163152 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="118f65edfa8738fac6a252bad7dc85f8f42e49e6f829370f0be09020284bc5e3" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.163181 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7st4m" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.270371 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh"] Jan 30 19:01:54 crc kubenswrapper[4782]: E0130 19:01:54.270791 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f287af23-a5f5-4aa9-b9c2-9cd87fc26da3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.270808 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f287af23-a5f5-4aa9-b9c2-9cd87fc26da3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.271006 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f287af23-a5f5-4aa9-b9c2-9cd87fc26da3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.271715 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.274523 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.274659 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.274773 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.282105 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.285570 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh"] Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.357543 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw8cv\" (UniqueName: \"kubernetes.io/projected/385f85fe-f3e6-4149-9241-ae72c3e9d52d-kube-api-access-xw8cv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.357592 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.357643 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.460117 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw8cv\" (UniqueName: \"kubernetes.io/projected/385f85fe-f3e6-4149-9241-ae72c3e9d52d-kube-api-access-xw8cv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.460197 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.460280 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.463714 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.464625 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.481947 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw8cv\" (UniqueName: \"kubernetes.io/projected/385f85fe-f3e6-4149-9241-ae72c3e9d52d-kube-api-access-xw8cv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:54 crc kubenswrapper[4782]: I0130 19:01:54.603086 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:01:55 crc kubenswrapper[4782]: I0130 19:01:55.201301 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh"] Jan 30 19:01:56 crc kubenswrapper[4782]: I0130 19:01:56.193311 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" event={"ID":"385f85fe-f3e6-4149-9241-ae72c3e9d52d","Type":"ContainerStarted","Data":"85df990a02e8b33bc5f5891ca98c365c17753846f15bdfce09d43041dde40028"} Jan 30 19:01:56 crc kubenswrapper[4782]: I0130 19:01:56.193627 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" event={"ID":"385f85fe-f3e6-4149-9241-ae72c3e9d52d","Type":"ContainerStarted","Data":"ebfe9e228d45a1c41ac8ce210b3a0b20c12012436b4a9e7490e6e942658cc693"} Jan 30 19:01:56 crc kubenswrapper[4782]: I0130 19:01:56.231565 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" podStartSLOduration=1.7182637330000001 podStartE2EDuration="2.231540928s" podCreationTimestamp="2026-01-30 19:01:54 +0000 UTC" firstStartedPulling="2026-01-30 19:01:55.192435976 +0000 UTC m=+1891.460814001" lastFinishedPulling="2026-01-30 19:01:55.705713161 +0000 UTC m=+1891.974091196" observedRunningTime="2026-01-30 19:01:56.215598693 +0000 UTC m=+1892.483976728" watchObservedRunningTime="2026-01-30 19:01:56.231540928 +0000 UTC m=+1892.499918993" Jan 30 19:02:05 crc kubenswrapper[4782]: I0130 19:02:05.297171 4782 generic.go:334] "Generic (PLEG): container finished" podID="385f85fe-f3e6-4149-9241-ae72c3e9d52d" containerID="85df990a02e8b33bc5f5891ca98c365c17753846f15bdfce09d43041dde40028" exitCode=0 Jan 30 19:02:05 crc kubenswrapper[4782]: I0130 19:02:05.297304 4782 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" event={"ID":"385f85fe-f3e6-4149-9241-ae72c3e9d52d","Type":"ContainerDied","Data":"85df990a02e8b33bc5f5891ca98c365c17753846f15bdfce09d43041dde40028"} Jan 30 19:02:05 crc kubenswrapper[4782]: I0130 19:02:05.411690 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:02:05 crc kubenswrapper[4782]: E0130 19:02:05.412134 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:02:06 crc kubenswrapper[4782]: I0130 19:02:06.822489 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:02:06 crc kubenswrapper[4782]: I0130 19:02:06.920137 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-inventory\") pod \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " Jan 30 19:02:06 crc kubenswrapper[4782]: I0130 19:02:06.920279 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-ssh-key-openstack-edpm-ipam\") pod \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " Jan 30 19:02:06 crc kubenswrapper[4782]: I0130 19:02:06.920378 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw8cv\" (UniqueName: \"kubernetes.io/projected/385f85fe-f3e6-4149-9241-ae72c3e9d52d-kube-api-access-xw8cv\") pod \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\" (UID: \"385f85fe-f3e6-4149-9241-ae72c3e9d52d\") " Jan 30 19:02:06 crc kubenswrapper[4782]: I0130 19:02:06.928982 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385f85fe-f3e6-4149-9241-ae72c3e9d52d-kube-api-access-xw8cv" (OuterVolumeSpecName: "kube-api-access-xw8cv") pod "385f85fe-f3e6-4149-9241-ae72c3e9d52d" (UID: "385f85fe-f3e6-4149-9241-ae72c3e9d52d"). InnerVolumeSpecName "kube-api-access-xw8cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:02:06 crc kubenswrapper[4782]: I0130 19:02:06.963006 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-inventory" (OuterVolumeSpecName: "inventory") pod "385f85fe-f3e6-4149-9241-ae72c3e9d52d" (UID: "385f85fe-f3e6-4149-9241-ae72c3e9d52d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:06 crc kubenswrapper[4782]: I0130 19:02:06.976539 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "385f85fe-f3e6-4149-9241-ae72c3e9d52d" (UID: "385f85fe-f3e6-4149-9241-ae72c3e9d52d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.035589 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.035645 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw8cv\" (UniqueName: \"kubernetes.io/projected/385f85fe-f3e6-4149-9241-ae72c3e9d52d-kube-api-access-xw8cv\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.035677 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/385f85fe-f3e6-4149-9241-ae72c3e9d52d-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.319622 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" event={"ID":"385f85fe-f3e6-4149-9241-ae72c3e9d52d","Type":"ContainerDied","Data":"ebfe9e228d45a1c41ac8ce210b3a0b20c12012436b4a9e7490e6e942658cc693"} Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.319695 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebfe9e228d45a1c41ac8ce210b3a0b20c12012436b4a9e7490e6e942658cc693" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.319743 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.427213 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58"] Jan 30 19:02:07 crc kubenswrapper[4782]: E0130 19:02:07.427697 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385f85fe-f3e6-4149-9241-ae72c3e9d52d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.427727 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="385f85fe-f3e6-4149-9241-ae72c3e9d52d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.428014 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="385f85fe-f3e6-4149-9241-ae72c3e9d52d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.428996 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.431643 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.431740 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.431911 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.432026 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.432838 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.433312 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.433914 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.435801 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.454414 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58"] Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.545132 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.545188 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.545395 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.545436 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.545468 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.545780 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546001 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546159 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546380 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99v7w\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-kube-api-access-99v7w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546455 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546623 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546787 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546859 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.546919 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.649134 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.649596 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.649644 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.649782 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.649858 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.649920 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.649973 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99v7w\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-kube-api-access-99v7w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.650013 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.650062 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.650106 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.650139 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.650176 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.650266 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.650310 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.654616 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.658877 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.660949 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.664445 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.665642 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.665666 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.665963 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.666556 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.667375 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.668301 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.668612 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.668704 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.670203 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.673450 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99v7w\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-kube-api-access-99v7w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zkx58\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:07 crc kubenswrapper[4782]: I0130 19:02:07.754713 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:08 crc kubenswrapper[4782]: I0130 19:02:08.419515 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58"] Jan 30 19:02:09 crc kubenswrapper[4782]: I0130 19:02:09.343252 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" event={"ID":"a4f9b344-67a5-4f16-99b1-d8402f3e44cb","Type":"ContainerStarted","Data":"5d04b271eaff7d5a0c1e3a2a1b2c8f236ca6da7c04410ea353bdd1464ae3e7ed"} Jan 30 19:02:09 crc kubenswrapper[4782]: I0130 19:02:09.343596 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" event={"ID":"a4f9b344-67a5-4f16-99b1-d8402f3e44cb","Type":"ContainerStarted","Data":"fafef45d1b51fe3411d01f316fc166433bfc54bafed50afca4fca88fe064afd7"} Jan 30 19:02:09 crc kubenswrapper[4782]: I0130 19:02:09.371324 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" podStartSLOduration=1.891949933 podStartE2EDuration="2.371305929s" podCreationTimestamp="2026-01-30 19:02:07 +0000 UTC" firstStartedPulling="2026-01-30 19:02:08.421831116 +0000 UTC m=+1904.690209161" lastFinishedPulling="2026-01-30 19:02:08.901187092 +0000 UTC m=+1905.169565157" observedRunningTime="2026-01-30 19:02:09.366567162 +0000 UTC m=+1905.634945197" watchObservedRunningTime="2026-01-30 19:02:09.371305929 +0000 UTC m=+1905.639683964" Jan 30 19:02:13 crc kubenswrapper[4782]: I0130 19:02:13.055533 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xwqx2"] Jan 30 19:02:13 crc kubenswrapper[4782]: I0130 19:02:13.070511 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xwqx2"] Jan 30 19:02:14 crc kubenswrapper[4782]: I0130 19:02:14.426690 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18a53ee-3844-44b6-b8f2-149dd7b6f725" path="/var/lib/kubelet/pods/d18a53ee-3844-44b6-b8f2-149dd7b6f725/volumes" Jan 30 19:02:18 crc kubenswrapper[4782]: I0130 19:02:18.410525 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:02:18 crc kubenswrapper[4782]: E0130 19:02:18.411065 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:02:25 crc kubenswrapper[4782]: I0130 19:02:25.086811 4782 scope.go:117] "RemoveContainer" containerID="0e9cacaa7b0cdad45cfb0edefec7a15f4c4225bee109c997a5cd0aee4813d2c1" Jan 30 19:02:25 crc kubenswrapper[4782]: I0130 19:02:25.170622 4782 scope.go:117] "RemoveContainer" containerID="34149667919c372d1708d52f8f7aa8cc0e64355b75cd2e4670631352cbe443b4" Jan 30 19:02:25 crc kubenswrapper[4782]: I0130 19:02:25.227834 4782 scope.go:117] "RemoveContainer" containerID="74e73487c938481085f5a683833434bb42ac064614962d8c5a80d9ea10221a76" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.487399 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ffs6f"] Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.491798 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.509873 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffs6f"] Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.632599 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-catalog-content\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.632787 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8dhg\" (UniqueName: \"kubernetes.io/projected/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-kube-api-access-j8dhg\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.632830 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-utilities\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.734317 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8dhg\" (UniqueName: \"kubernetes.io/projected/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-kube-api-access-j8dhg\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.734393 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-utilities\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.734515 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-catalog-content\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.735337 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-utilities\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.735395 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-catalog-content\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.771139 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8dhg\" (UniqueName: \"kubernetes.io/projected/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-kube-api-access-j8dhg\") pod \"redhat-marketplace-ffs6f\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:28 crc kubenswrapper[4782]: I0130 19:02:28.828146 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:29 crc kubenswrapper[4782]: I0130 19:02:29.282617 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffs6f"] Jan 30 19:02:29 crc kubenswrapper[4782]: W0130 19:02:29.290918 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f1adff4_a3c6_4cc8_8d5b_f1d5b93242f6.slice/crio-5252b663db9a712d89486fce92683b86881fd0e1f061d446e20b96dbd2f69a60 WatchSource:0}: Error finding container 5252b663db9a712d89486fce92683b86881fd0e1f061d446e20b96dbd2f69a60: Status 404 returned error can't find the container with id 5252b663db9a712d89486fce92683b86881fd0e1f061d446e20b96dbd2f69a60 Jan 30 19:02:29 crc kubenswrapper[4782]: I0130 19:02:29.582852 4782 generic.go:334] "Generic (PLEG): container finished" podID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerID="36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1" exitCode=0 Jan 30 19:02:29 crc kubenswrapper[4782]: I0130 19:02:29.582910 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffs6f" event={"ID":"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6","Type":"ContainerDied","Data":"36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1"} Jan 30 19:02:29 crc kubenswrapper[4782]: I0130 19:02:29.582991 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffs6f" event={"ID":"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6","Type":"ContainerStarted","Data":"5252b663db9a712d89486fce92683b86881fd0e1f061d446e20b96dbd2f69a60"} Jan 30 19:02:31 crc kubenswrapper[4782]: I0130 19:02:31.612089 4782 generic.go:334] "Generic (PLEG): container finished" podID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerID="9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff" exitCode=0 Jan 30 19:02:31 crc kubenswrapper[4782]: I0130 19:02:31.612188 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ffs6f" event={"ID":"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6","Type":"ContainerDied","Data":"9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff"} Jan 30 19:02:32 crc kubenswrapper[4782]: I0130 19:02:32.624155 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffs6f" event={"ID":"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6","Type":"ContainerStarted","Data":"d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366"} Jan 30 19:02:32 crc kubenswrapper[4782]: I0130 19:02:32.645775 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ffs6f" podStartSLOduration=2.179131168 podStartE2EDuration="4.645755317s" podCreationTimestamp="2026-01-30 19:02:28 +0000 UTC" firstStartedPulling="2026-01-30 19:02:29.584953859 +0000 UTC m=+1925.853331894" lastFinishedPulling="2026-01-30 19:02:32.051578018 +0000 UTC m=+1928.319956043" observedRunningTime="2026-01-30 19:02:32.642435384 +0000 UTC m=+1928.910813429" watchObservedRunningTime="2026-01-30 19:02:32.645755317 +0000 UTC m=+1928.914133352" Jan 30 19:02:33 crc kubenswrapper[4782]: I0130 19:02:33.410820 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:02:33 crc kubenswrapper[4782]: E0130 19:02:33.411125 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:02:38 crc kubenswrapper[4782]: I0130 19:02:38.828882 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:38 crc kubenswrapper[4782]: I0130 19:02:38.829454 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:38 crc kubenswrapper[4782]: I0130 19:02:38.875767 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:39 crc kubenswrapper[4782]: I0130 19:02:39.766719 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:39 crc kubenswrapper[4782]: I0130 19:02:39.832452 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffs6f"] Jan 30 19:02:41 crc kubenswrapper[4782]: I0130 19:02:41.711927 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ffs6f" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="registry-server" containerID="cri-o://d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366" gracePeriod=2 Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.183608 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.266142 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-utilities\") pod \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.266255 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-catalog-content\") pod \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.266288 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8dhg\" (UniqueName: \"kubernetes.io/projected/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-kube-api-access-j8dhg\") pod \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\" (UID: \"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6\") " Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.268642 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-utilities" (OuterVolumeSpecName: "utilities") pod "3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" (UID: "3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.274826 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-kube-api-access-j8dhg" (OuterVolumeSpecName: "kube-api-access-j8dhg") pod "3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" (UID: "3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6"). InnerVolumeSpecName "kube-api-access-j8dhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.294726 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" (UID: "3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.369327 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.369386 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.369419 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8dhg\" (UniqueName: \"kubernetes.io/projected/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6-kube-api-access-j8dhg\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.722012 4782 generic.go:334] "Generic (PLEG): container finished" podID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerID="d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366" exitCode=0 Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.722060 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffs6f" event={"ID":"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6","Type":"ContainerDied","Data":"d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366"} Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.722098 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffs6f" event={"ID":"3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6","Type":"ContainerDied","Data":"5252b663db9a712d89486fce92683b86881fd0e1f061d446e20b96dbd2f69a60"} Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.722119 4782 scope.go:117] "RemoveContainer" containerID="d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.722180 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffs6f" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.743518 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffs6f"] Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.750159 4782 scope.go:117] "RemoveContainer" containerID="9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.750964 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffs6f"] Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.787830 4782 scope.go:117] "RemoveContainer" containerID="36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.820588 4782 scope.go:117] "RemoveContainer" containerID="d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366" Jan 30 19:02:42 crc kubenswrapper[4782]: E0130 19:02:42.821283 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366\": container with ID starting with d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366 not found: ID does not exist" containerID="d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.821330 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366"} err="failed to get container status \"d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366\": rpc error: code = NotFound desc = could not find container \"d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366\": container with ID starting with d35a5d8c7e2dabd14051cb1d7fbdc9805e8e6f4925fdb843195be3d5a6f88366 not found: ID does not exist" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.821358 4782 scope.go:117] "RemoveContainer" containerID="9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff" Jan 30 19:02:42 crc kubenswrapper[4782]: E0130 19:02:42.821738 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff\": container with ID starting with 9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff not found: ID does not exist" containerID="9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.821822 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff"} err="failed to get container status \"9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff\": rpc error: code = NotFound desc = could not find container \"9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff\": container with ID starting with 9d454d2d36b5769a2834e4911dcf6dd6b1f73b3f692b4b20e36e39f4c7ae3bff not found: ID does not exist" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.821872 4782 scope.go:117] "RemoveContainer" containerID="36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1" Jan 30 19:02:42 crc kubenswrapper[4782]: E0130 19:02:42.822302 4782 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1\": container with ID starting with 36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1 not found: ID does not exist" containerID="36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1" Jan 30 19:02:42 crc kubenswrapper[4782]: I0130 19:02:42.822332 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1"} err="failed to get container status \"36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1\": rpc error: code = NotFound desc = could not find container \"36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1\": container with ID starting with 36568b2ca699fd2c81168a8a50dc4f7f97fc5b2af82ff2c5d62668f05d010cd1 not found: ID does not exist" Jan 30 19:02:44 crc kubenswrapper[4782]: I0130 19:02:44.420028 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:02:44 crc kubenswrapper[4782]: E0130 19:02:44.422420 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:02:44 crc kubenswrapper[4782]: I0130 19:02:44.422786 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" path="/var/lib/kubelet/pods/3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6/volumes" Jan 30 19:02:47 crc kubenswrapper[4782]: I0130 19:02:47.780184 4782 generic.go:334] "Generic (PLEG): container finished" podID="a4f9b344-67a5-4f16-99b1-d8402f3e44cb" containerID="5d04b271eaff7d5a0c1e3a2a1b2c8f236ca6da7c04410ea353bdd1464ae3e7ed" exitCode=0 Jan 30 19:02:47 crc kubenswrapper[4782]: I0130 19:02:47.780299 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" event={"ID":"a4f9b344-67a5-4f16-99b1-d8402f3e44cb","Type":"ContainerDied","Data":"5d04b271eaff7d5a0c1e3a2a1b2c8f236ca6da7c04410ea353bdd1464ae3e7ed"} Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.305135 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.427952 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99v7w\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-kube-api-access-99v7w\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428436 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-libvirt-combined-ca-bundle\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428473 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-bootstrap-combined-ca-bundle\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428514 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-nova-combined-ca-bundle\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428578 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428623 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-neutron-metadata-combined-ca-bundle\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428697 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-repo-setup-combined-ca-bundle\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428720 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-telemetry-combined-ca-bundle\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428744 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: 
\"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428786 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428810 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ovn-combined-ca-bundle\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428893 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ssh-key-openstack-edpm-ipam\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428911 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.428933 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-inventory\") pod \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\" (UID: \"a4f9b344-67a5-4f16-99b1-d8402f3e44cb\") " Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.436565 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.436643 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.437416 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.437453 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.437494 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.438153 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.438540 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.439545 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-kube-api-access-99v7w" (OuterVolumeSpecName: "kube-api-access-99v7w") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "kube-api-access-99v7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.439897 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.440939 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.441978 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.441988 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.465868 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.476590 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-inventory" (OuterVolumeSpecName: "inventory") pod "a4f9b344-67a5-4f16-99b1-d8402f3e44cb" (UID: "a4f9b344-67a5-4f16-99b1-d8402f3e44cb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.531807 4782 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.531876 4782 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.531900 4782 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.531922 4782 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.531942 4782 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.531962 4782 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.531982 4782 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.532002 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.532021 4782 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.532042 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.532059 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99v7w\" (UniqueName: \"kubernetes.io/projected/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-kube-api-access-99v7w\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.532077 4782 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.532095 4782 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.532113 4782 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f9b344-67a5-4f16-99b1-d8402f3e44cb-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.813269 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" event={"ID":"a4f9b344-67a5-4f16-99b1-d8402f3e44cb","Type":"ContainerDied","Data":"fafef45d1b51fe3411d01f316fc166433bfc54bafed50afca4fca88fe064afd7"} Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.813705 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fafef45d1b51fe3411d01f316fc166433bfc54bafed50afca4fca88fe064afd7" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.813404 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zkx58" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.968447 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz"] Jan 30 19:02:49 crc kubenswrapper[4782]: E0130 19:02:49.969101 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="extract-utilities" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.969134 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="extract-utilities" Jan 30 19:02:49 crc kubenswrapper[4782]: E0130 19:02:49.969174 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="extract-content" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.969189 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="extract-content" Jan 30 19:02:49 crc kubenswrapper[4782]: E0130 19:02:49.969220 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f9b344-67a5-4f16-99b1-d8402f3e44cb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.969268 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f9b344-67a5-4f16-99b1-d8402f3e44cb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 19:02:49 crc kubenswrapper[4782]: E0130 19:02:49.969345 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="registry-server" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.969367 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="registry-server" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.969813 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1adff4-a3c6-4cc8-8d5b-f1d5b93242f6" containerName="registry-server" Jan 30 19:02:49 crc kubenswrapper[4782]: 
I0130 19:02:49.969870 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f9b344-67a5-4f16-99b1-d8402f3e44cb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.971182 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.975017 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.975711 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.976048 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.976779 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.977350 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:02:49 crc kubenswrapper[4782]: I0130 19:02:49.989633 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz"] Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.042790 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.042872 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jczqj\" (UniqueName: \"kubernetes.io/projected/5e8ffe68-337e-40ee-a941-188e1bad9112-kube-api-access-jczqj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.042949 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.042979 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8ffe68-337e-40ee-a941-188e1bad9112-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.043059 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.144776 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jczqj\" (UniqueName: \"kubernetes.io/projected/5e8ffe68-337e-40ee-a941-188e1bad9112-kube-api-access-jczqj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.144833 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.144870 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8ffe68-337e-40ee-a941-188e1bad9112-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.144963 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.145076 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.147264 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8ffe68-337e-40ee-a941-188e1bad9112-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.151468 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.151468 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-inventory\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.160406 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.168417 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jczqj\" (UniqueName: \"kubernetes.io/projected/5e8ffe68-337e-40ee-a941-188e1bad9112-kube-api-access-jczqj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-qncfz\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.313314 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:02:50 crc kubenswrapper[4782]: I0130 19:02:50.978097 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz"] Jan 30 19:02:51 crc kubenswrapper[4782]: I0130 19:02:51.845643 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" event={"ID":"5e8ffe68-337e-40ee-a941-188e1bad9112","Type":"ContainerStarted","Data":"17d877a21ff7644fb8df0e460436a33cd66b84891b1be4522ac6e64e7b33fb79"} Jan 30 19:02:51 crc kubenswrapper[4782]: I0130 19:02:51.845983 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" event={"ID":"5e8ffe68-337e-40ee-a941-188e1bad9112","Type":"ContainerStarted","Data":"701f69831b714d346c9de5996c11c5c572a6e2fe71cd5bd5457bfdbdea5715df"} Jan 30 19:02:51 crc kubenswrapper[4782]: I0130 19:02:51.873721 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" podStartSLOduration=2.421722265 podStartE2EDuration="2.873699313s" podCreationTimestamp="2026-01-30 19:02:49 +0000 UTC" firstStartedPulling="2026-01-30 19:02:50.996270144 +0000 UTC m=+1947.264648209" lastFinishedPulling="2026-01-30 19:02:51.448247232 +0000 UTC m=+1947.716625257" observedRunningTime="2026-01-30 19:02:51.867572691 +0000 UTC m=+1948.135950736" watchObservedRunningTime="2026-01-30 19:02:51.873699313 +0000 UTC m=+1948.142077338" Jan 30 19:02:57 crc kubenswrapper[4782]: I0130 19:02:57.410677 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:02:57 crc kubenswrapper[4782]: E0130 19:02:57.411316 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:03:11 crc kubenswrapper[4782]: I0130 19:03:11.411362 4782 scope.go:117] "RemoveContainer" 
containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:03:11 crc kubenswrapper[4782]: E0130 19:03:11.412171 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:03:23 crc kubenswrapper[4782]: I0130 19:03:23.411120 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:03:24 crc kubenswrapper[4782]: I0130 19:03:24.180549 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"a8248bdd536385dad9e4a7ba9de482d0518f9893a440220a80b84827335dcbf3"} Jan 30 19:03:59 crc kubenswrapper[4782]: I0130 19:03:59.581484 4782 generic.go:334] "Generic (PLEG): container finished" podID="5e8ffe68-337e-40ee-a941-188e1bad9112" containerID="17d877a21ff7644fb8df0e460436a33cd66b84891b1be4522ac6e64e7b33fb79" exitCode=0 Jan 30 19:03:59 crc kubenswrapper[4782]: I0130 19:03:59.581581 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" event={"ID":"5e8ffe68-337e-40ee-a941-188e1bad9112","Type":"ContainerDied","Data":"17d877a21ff7644fb8df0e460436a33cd66b84891b1be4522ac6e64e7b33fb79"} Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.019259 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.069309 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8ffe68-337e-40ee-a941-188e1bad9112-ovncontroller-config-0\") pod \"5e8ffe68-337e-40ee-a941-188e1bad9112\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.069403 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jczqj\" (UniqueName: \"kubernetes.io/projected/5e8ffe68-337e-40ee-a941-188e1bad9112-kube-api-access-jczqj\") pod \"5e8ffe68-337e-40ee-a941-188e1bad9112\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.069582 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ovn-combined-ca-bundle\") pod \"5e8ffe68-337e-40ee-a941-188e1bad9112\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.069955 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ssh-key-openstack-edpm-ipam\") pod \"5e8ffe68-337e-40ee-a941-188e1bad9112\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.070387 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-inventory\") pod \"5e8ffe68-337e-40ee-a941-188e1bad9112\" (UID: \"5e8ffe68-337e-40ee-a941-188e1bad9112\") " Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.075832 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5e8ffe68-337e-40ee-a941-188e1bad9112" (UID: "5e8ffe68-337e-40ee-a941-188e1bad9112"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.076341 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8ffe68-337e-40ee-a941-188e1bad9112-kube-api-access-jczqj" (OuterVolumeSpecName: "kube-api-access-jczqj") pod "5e8ffe68-337e-40ee-a941-188e1bad9112" (UID: "5e8ffe68-337e-40ee-a941-188e1bad9112"). InnerVolumeSpecName "kube-api-access-jczqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.098002 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-inventory" (OuterVolumeSpecName: "inventory") pod "5e8ffe68-337e-40ee-a941-188e1bad9112" (UID: "5e8ffe68-337e-40ee-a941-188e1bad9112"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.112933 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e8ffe68-337e-40ee-a941-188e1bad9112" (UID: "5e8ffe68-337e-40ee-a941-188e1bad9112"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.121409 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e8ffe68-337e-40ee-a941-188e1bad9112-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5e8ffe68-337e-40ee-a941-188e1bad9112" (UID: "5e8ffe68-337e-40ee-a941-188e1bad9112"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.172799 4782 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5e8ffe68-337e-40ee-a941-188e1bad9112-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.172835 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jczqj\" (UniqueName: \"kubernetes.io/projected/5e8ffe68-337e-40ee-a941-188e1bad9112-kube-api-access-jczqj\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.172849 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.172860 4782 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.172871 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e8ffe68-337e-40ee-a941-188e1bad9112-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.605280 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" event={"ID":"5e8ffe68-337e-40ee-a941-188e1bad9112","Type":"ContainerDied","Data":"701f69831b714d346c9de5996c11c5c572a6e2fe71cd5bd5457bfdbdea5715df"} Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.605653 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701f69831b714d346c9de5996c11c5c572a6e2fe71cd5bd5457bfdbdea5715df" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.605317 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-qncfz" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.765590 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc"] Jan 30 19:04:01 crc kubenswrapper[4782]: E0130 19:04:01.766437 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8ffe68-337e-40ee-a941-188e1bad9112" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.766611 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8ffe68-337e-40ee-a941-188e1bad9112" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.767051 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8ffe68-337e-40ee-a941-188e1bad9112" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.768611 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.773116 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.773389 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.773415 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.773467 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.773490 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.773530 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.778556 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc"] Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.787076 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.787426 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.787631 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-chgh7\" (UniqueName: \"kubernetes.io/projected/39dc3714-072a-4267-812c-49c2aa1efe2d-kube-api-access-chgh7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.787768 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.787912 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.788177 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.890107 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.890272 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.890311 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.890384 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chgh7\" (UniqueName: \"kubernetes.io/projected/39dc3714-072a-4267-812c-49c2aa1efe2d-kube-api-access-chgh7\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.890414 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.890443 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.895782 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.896029 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.896448 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.897590 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc kubenswrapper[4782]: I0130 19:04:01.897794 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:01 crc 
kubenswrapper[4782]: I0130 19:04:01.911166 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chgh7\" (UniqueName: \"kubernetes.io/projected/39dc3714-072a-4267-812c-49c2aa1efe2d-kube-api-access-chgh7\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:02 crc kubenswrapper[4782]: I0130 19:04:02.103309 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:02 crc kubenswrapper[4782]: I0130 19:04:02.728652 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc"] Jan 30 19:04:03 crc kubenswrapper[4782]: I0130 19:04:03.626651 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" event={"ID":"39dc3714-072a-4267-812c-49c2aa1efe2d","Type":"ContainerStarted","Data":"1a0b9c85f66a10d32fcd4acdacd97510441c194ccc66e519c7cb9f2374bcfd40"} Jan 30 19:04:03 crc kubenswrapper[4782]: I0130 19:04:03.627046 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" event={"ID":"39dc3714-072a-4267-812c-49c2aa1efe2d","Type":"ContainerStarted","Data":"169f25b9a7b56c4d590211476d2a710f5a4439830d9faf77836f2a7876e8e24a"} Jan 30 19:04:03 crc kubenswrapper[4782]: I0130 19:04:03.663773 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" podStartSLOduration=2.227696931 podStartE2EDuration="2.663752555s" podCreationTimestamp="2026-01-30 19:04:01 +0000 UTC" firstStartedPulling="2026-01-30 19:04:02.737083697 +0000 UTC m=+2019.005461722" lastFinishedPulling="2026-01-30 19:04:03.173139291 +0000 UTC m=+2019.441517346" observedRunningTime="2026-01-30 19:04:03.645074473 +0000 UTC m=+2019.913452508" watchObservedRunningTime="2026-01-30 19:04:03.663752555 +0000 UTC m=+2019.932130590" Jan 30 19:04:52 crc kubenswrapper[4782]: I0130 19:04:52.152005 4782 generic.go:334] "Generic (PLEG): container finished" podID="39dc3714-072a-4267-812c-49c2aa1efe2d" containerID="1a0b9c85f66a10d32fcd4acdacd97510441c194ccc66e519c7cb9f2374bcfd40" exitCode=0 Jan 30 19:04:52 crc kubenswrapper[4782]: I0130 19:04:52.152100 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" event={"ID":"39dc3714-072a-4267-812c-49c2aa1efe2d","Type":"ContainerDied","Data":"1a0b9c85f66a10d32fcd4acdacd97510441c194ccc66e519c7cb9f2374bcfd40"} Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.639882 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.724979 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-nova-metadata-neutron-config-0\") pod \"39dc3714-072a-4267-812c-49c2aa1efe2d\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.725378 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chgh7\" (UniqueName: \"kubernetes.io/projected/39dc3714-072a-4267-812c-49c2aa1efe2d-kube-api-access-chgh7\") pod \"39dc3714-072a-4267-812c-49c2aa1efe2d\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.725440 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-inventory\") pod \"39dc3714-072a-4267-812c-49c2aa1efe2d\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.725579 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"39dc3714-072a-4267-812c-49c2aa1efe2d\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.725627 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-metadata-combined-ca-bundle\") pod \"39dc3714-072a-4267-812c-49c2aa1efe2d\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.725681 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-ssh-key-openstack-edpm-ipam\") pod \"39dc3714-072a-4267-812c-49c2aa1efe2d\" (UID: \"39dc3714-072a-4267-812c-49c2aa1efe2d\") " Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.738596 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "39dc3714-072a-4267-812c-49c2aa1efe2d" (UID: "39dc3714-072a-4267-812c-49c2aa1efe2d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.738695 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39dc3714-072a-4267-812c-49c2aa1efe2d-kube-api-access-chgh7" (OuterVolumeSpecName: "kube-api-access-chgh7") pod "39dc3714-072a-4267-812c-49c2aa1efe2d" (UID: "39dc3714-072a-4267-812c-49c2aa1efe2d"). InnerVolumeSpecName "kube-api-access-chgh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.779602 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "39dc3714-072a-4267-812c-49c2aa1efe2d" (UID: "39dc3714-072a-4267-812c-49c2aa1efe2d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.781462 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-inventory" (OuterVolumeSpecName: "inventory") pod "39dc3714-072a-4267-812c-49c2aa1efe2d" (UID: "39dc3714-072a-4267-812c-49c2aa1efe2d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.786756 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "39dc3714-072a-4267-812c-49c2aa1efe2d" (UID: "39dc3714-072a-4267-812c-49c2aa1efe2d"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.792272 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "39dc3714-072a-4267-812c-49c2aa1efe2d" (UID: "39dc3714-072a-4267-812c-49c2aa1efe2d"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.828557 4782 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.828609 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.828632 4782 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.828651 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chgh7\" (UniqueName: \"kubernetes.io/projected/39dc3714-072a-4267-812c-49c2aa1efe2d-kube-api-access-chgh7\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.828671 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:53 crc kubenswrapper[4782]: I0130 19:04:53.828690 4782 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/39dc3714-072a-4267-812c-49c2aa1efe2d-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.176835 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" event={"ID":"39dc3714-072a-4267-812c-49c2aa1efe2d","Type":"ContainerDied","Data":"169f25b9a7b56c4d590211476d2a710f5a4439830d9faf77836f2a7876e8e24a"} Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.176897 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169f25b9a7b56c4d590211476d2a710f5a4439830d9faf77836f2a7876e8e24a" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.176912 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.327959 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm"] Jan 30 19:04:54 crc kubenswrapper[4782]: E0130 19:04:54.328547 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39dc3714-072a-4267-812c-49c2aa1efe2d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.328577 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dc3714-072a-4267-812c-49c2aa1efe2d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.329024 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="39dc3714-072a-4267-812c-49c2aa1efe2d" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.330096 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.333441 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.333459 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.333769 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.334298 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.339875 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.353482 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm"] Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.440859 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.440919 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8hm\" (UniqueName: \"kubernetes.io/projected/31a7790e-b097-45c3-9088-5fc885e63ef8-kube-api-access-hj8hm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.440953 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: 
\"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.441002 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.441128 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.544215 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.544412 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8hm\" (UniqueName: \"kubernetes.io/projected/31a7790e-b097-45c3-9088-5fc885e63ef8-kube-api-access-hj8hm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.544492 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.544683 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.544786 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.559695 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: 
\"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.560458 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.560911 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.582869 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8hm\" (UniqueName: \"kubernetes.io/projected/31a7790e-b097-45c3-9088-5fc885e63ef8-kube-api-access-hj8hm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.597220 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8cccm\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:54 crc kubenswrapper[4782]: I0130 19:04:54.664087 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:04:55 crc kubenswrapper[4782]: I0130 19:04:55.249262 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm"] Jan 30 19:04:56 crc kubenswrapper[4782]: I0130 19:04:56.203585 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" event={"ID":"31a7790e-b097-45c3-9088-5fc885e63ef8","Type":"ContainerStarted","Data":"3feebe6564305bc99bddc439b2a5f29c859ff097de90f342af46a3aafd83412f"} Jan 30 19:04:56 crc kubenswrapper[4782]: I0130 19:04:56.203888 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" event={"ID":"31a7790e-b097-45c3-9088-5fc885e63ef8","Type":"ContainerStarted","Data":"70527b6db60439ed79fdd995935a25894c0a6cc3cd7b99ba5ea3286e570a511d"} Jan 30 19:04:56 crc kubenswrapper[4782]: I0130 19:04:56.231976 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" podStartSLOduration=1.789795424 podStartE2EDuration="2.231951249s" podCreationTimestamp="2026-01-30 19:04:54 +0000 UTC" firstStartedPulling="2026-01-30 19:04:55.250627237 +0000 UTC m=+2071.519005272" lastFinishedPulling="2026-01-30 19:04:55.692783072 +0000 UTC m=+2071.961161097" observedRunningTime="2026-01-30 19:04:56.222570537 +0000 UTC m=+2072.490948572" watchObservedRunningTime="2026-01-30 19:04:56.231951249 +0000 UTC m=+2072.500329274" Jan 30 19:05:49 crc kubenswrapper[4782]: I0130 19:05:49.792331 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:05:49 crc kubenswrapper[4782]: I0130 19:05:49.792909 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:06:19 crc kubenswrapper[4782]: I0130 19:06:19.793486 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:06:19 crc kubenswrapper[4782]: I0130 19:06:19.795928 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:06:49 crc kubenswrapper[4782]: I0130 19:06:49.793261 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:06:49 crc kubenswrapper[4782]: I0130 19:06:49.794918 4782 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:06:49 crc kubenswrapper[4782]: I0130 19:06:49.795050 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:06:49 crc kubenswrapper[4782]: I0130 19:06:49.796535 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a8248bdd536385dad9e4a7ba9de482d0518f9893a440220a80b84827335dcbf3"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:06:49 crc kubenswrapper[4782]: I0130 19:06:49.796828 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://a8248bdd536385dad9e4a7ba9de482d0518f9893a440220a80b84827335dcbf3" gracePeriod=600 Jan 30 19:06:50 crc kubenswrapper[4782]: I0130 19:06:50.508963 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="a8248bdd536385dad9e4a7ba9de482d0518f9893a440220a80b84827335dcbf3" exitCode=0 Jan 30 19:06:50 crc kubenswrapper[4782]: I0130 19:06:50.509023 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"a8248bdd536385dad9e4a7ba9de482d0518f9893a440220a80b84827335dcbf3"} Jan 30 19:06:50 crc kubenswrapper[4782]: I0130 19:06:50.509055 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052"} Jan 30 19:06:50 crc kubenswrapper[4782]: I0130 19:06:50.509074 4782 scope.go:117] "RemoveContainer" containerID="3b78b8ee3bb0c6321f07ad70872e81847e900616f081875fc2b3e8e92622b66a" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.585304 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6pvqc"] Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.590211 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.596787 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6pvqc"] Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.737205 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6lf\" (UniqueName: \"kubernetes.io/projected/be36622d-3651-4140-ab52-4f76eec4eab5-kube-api-access-bb6lf\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.737378 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-utilities\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.737419 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-catalog-content\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.788677 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mrfpc"] Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.791308 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.804328 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrfpc"] Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.841343 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-utilities\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.841404 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-catalog-content\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.841495 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6lf\" (UniqueName: \"kubernetes.io/projected/be36622d-3651-4140-ab52-4f76eec4eab5-kube-api-access-bb6lf\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.842213 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-utilities\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.842376 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-catalog-content\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.864781 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6lf\" (UniqueName: \"kubernetes.io/projected/be36622d-3651-4140-ab52-4f76eec4eab5-kube-api-access-bb6lf\") pod \"certified-operators-6pvqc\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.914976 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.944358 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-catalog-content\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.944404 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-utilities\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:55 crc kubenswrapper[4782]: I0130 19:06:55.944447 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zf8s\" (UniqueName: \"kubernetes.io/projected/94368d1d-c550-4681-ab18-317352d860be-kube-api-access-8zf8s\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.046489 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-catalog-content\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.046795 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-utilities\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.047073 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-catalog-content\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.047178 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-utilities\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.047553 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zf8s\" (UniqueName: \"kubernetes.io/projected/94368d1d-c550-4681-ab18-317352d860be-kube-api-access-8zf8s\") pod \"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.068140 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zf8s\" (UniqueName: \"kubernetes.io/projected/94368d1d-c550-4681-ab18-317352d860be-kube-api-access-8zf8s\") pod 
\"community-operators-mrfpc\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.118756 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.583196 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6pvqc"] Jan 30 19:06:56 crc kubenswrapper[4782]: I0130 19:06:56.872210 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mrfpc"] Jan 30 19:06:57 crc kubenswrapper[4782]: I0130 19:06:57.591766 4782 generic.go:334] "Generic (PLEG): container finished" podID="be36622d-3651-4140-ab52-4f76eec4eab5" containerID="375053c1c41f3751c9a11f07ec8eb2b75b525b74c5b14dd9793982f9f67ac074" exitCode=0 Jan 30 19:06:57 crc kubenswrapper[4782]: I0130 19:06:57.591880 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pvqc" event={"ID":"be36622d-3651-4140-ab52-4f76eec4eab5","Type":"ContainerDied","Data":"375053c1c41f3751c9a11f07ec8eb2b75b525b74c5b14dd9793982f9f67ac074"} Jan 30 19:06:57 crc kubenswrapper[4782]: I0130 19:06:57.592244 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pvqc" event={"ID":"be36622d-3651-4140-ab52-4f76eec4eab5","Type":"ContainerStarted","Data":"a3550c9b90ad6ea1a146241480ac507267382ffcc3fe47bcb477f51f1f160c81"} Jan 30 19:06:57 crc kubenswrapper[4782]: I0130 19:06:57.593997 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:06:57 crc kubenswrapper[4782]: I0130 19:06:57.594770 4782 generic.go:334] "Generic (PLEG): container finished" podID="94368d1d-c550-4681-ab18-317352d860be" containerID="928d063080cf665c97a71d45f31676ebf04f4072b4136570914d233e9d51950c" exitCode=0 Jan 30 19:06:57 crc kubenswrapper[4782]: I0130 19:06:57.594810 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrfpc" event={"ID":"94368d1d-c550-4681-ab18-317352d860be","Type":"ContainerDied","Data":"928d063080cf665c97a71d45f31676ebf04f4072b4136570914d233e9d51950c"} Jan 30 19:06:57 crc kubenswrapper[4782]: I0130 19:06:57.594836 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrfpc" event={"ID":"94368d1d-c550-4681-ab18-317352d860be","Type":"ContainerStarted","Data":"3838e5fea3eb70a0d4a4d0905019cfbd08895b0e545cd3090aba91f26931d17d"} Jan 30 19:06:58 crc kubenswrapper[4782]: I0130 19:06:58.604635 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrfpc" event={"ID":"94368d1d-c550-4681-ab18-317352d860be","Type":"ContainerStarted","Data":"0ce90338b05a550c7d92e063cbdf659250aa81a4cd8bf4c6918672b46b0493fd"} Jan 30 19:06:58 crc kubenswrapper[4782]: I0130 19:06:58.987983 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t4whs"] Jan 30 19:06:58 crc kubenswrapper[4782]: I0130 19:06:58.990366 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.000536 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4whs"] Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.123621 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-utilities\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.124056 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trn7n\" (UniqueName: \"kubernetes.io/projected/17460b94-f89d-4b35-a7a6-fb037bd61672-kube-api-access-trn7n\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.124203 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-catalog-content\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.226005 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trn7n\" (UniqueName: \"kubernetes.io/projected/17460b94-f89d-4b35-a7a6-fb037bd61672-kube-api-access-trn7n\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.226105 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-catalog-content\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.226180 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-utilities\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.226757 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-catalog-content\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.226794 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-utilities\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.248815 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-trn7n\" (UniqueName: \"kubernetes.io/projected/17460b94-f89d-4b35-a7a6-fb037bd61672-kube-api-access-trn7n\") pod \"redhat-operators-t4whs\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.305972 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.619263 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pvqc" event={"ID":"be36622d-3651-4140-ab52-4f76eec4eab5","Type":"ContainerStarted","Data":"13181d59359ae07fcca21cbd6be61e3afbaa6b708e7d1179721d5e3fbef66226"} Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.621030 4782 generic.go:334] "Generic (PLEG): container finished" podID="94368d1d-c550-4681-ab18-317352d860be" containerID="0ce90338b05a550c7d92e063cbdf659250aa81a4cd8bf4c6918672b46b0493fd" exitCode=0 Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.621063 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrfpc" event={"ID":"94368d1d-c550-4681-ab18-317352d860be","Type":"ContainerDied","Data":"0ce90338b05a550c7d92e063cbdf659250aa81a4cd8bf4c6918672b46b0493fd"} Jan 30 19:06:59 crc kubenswrapper[4782]: W0130 19:06:59.823394 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17460b94_f89d_4b35_a7a6_fb037bd61672.slice/crio-ae5f2bf78b129c4ad179082d81ad3667e855f71979ebf6bf629cc1a8a63cb0fc WatchSource:0}: Error finding container ae5f2bf78b129c4ad179082d81ad3667e855f71979ebf6bf629cc1a8a63cb0fc: Status 404 returned error can't find the container with id ae5f2bf78b129c4ad179082d81ad3667e855f71979ebf6bf629cc1a8a63cb0fc Jan 30 19:06:59 crc kubenswrapper[4782]: I0130 19:06:59.825852 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t4whs"] Jan 30 19:07:00 crc kubenswrapper[4782]: I0130 19:07:00.632995 4782 generic.go:334] "Generic (PLEG): container finished" podID="be36622d-3651-4140-ab52-4f76eec4eab5" containerID="13181d59359ae07fcca21cbd6be61e3afbaa6b708e7d1179721d5e3fbef66226" exitCode=0 Jan 30 19:07:00 crc kubenswrapper[4782]: I0130 19:07:00.633074 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pvqc" event={"ID":"be36622d-3651-4140-ab52-4f76eec4eab5","Type":"ContainerDied","Data":"13181d59359ae07fcca21cbd6be61e3afbaa6b708e7d1179721d5e3fbef66226"} Jan 30 19:07:00 crc kubenswrapper[4782]: I0130 19:07:00.634605 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerStarted","Data":"ae5f2bf78b129c4ad179082d81ad3667e855f71979ebf6bf629cc1a8a63cb0fc"} Jan 30 19:07:03 crc kubenswrapper[4782]: I0130 19:07:03.667891 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerStarted","Data":"951195c8203280bc42847939c56d6b7e17cf018e615a3a598b2c54591ae329ac"} Jan 30 19:07:05 crc kubenswrapper[4782]: I0130 19:07:05.694265 4782 generic.go:334] "Generic (PLEG): container finished" podID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerID="951195c8203280bc42847939c56d6b7e17cf018e615a3a598b2c54591ae329ac" exitCode=0 Jan 30 19:07:05 crc 
kubenswrapper[4782]: I0130 19:07:05.694343 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerDied","Data":"951195c8203280bc42847939c56d6b7e17cf018e615a3a598b2c54591ae329ac"} Jan 30 19:07:06 crc kubenswrapper[4782]: I0130 19:07:06.727648 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrfpc" event={"ID":"94368d1d-c550-4681-ab18-317352d860be","Type":"ContainerStarted","Data":"d423f6d80668fcd737acf1ab508643cfd44b9bb14f9ca58b623ebc85ce165ec5"} Jan 30 19:07:06 crc kubenswrapper[4782]: I0130 19:07:06.775920 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mrfpc" podStartSLOduration=3.532588481 podStartE2EDuration="11.775895757s" podCreationTimestamp="2026-01-30 19:06:55 +0000 UTC" firstStartedPulling="2026-01-30 19:06:57.597680234 +0000 UTC m=+2193.866058269" lastFinishedPulling="2026-01-30 19:07:05.84098752 +0000 UTC m=+2202.109365545" observedRunningTime="2026-01-30 19:07:06.768927124 +0000 UTC m=+2203.037305149" watchObservedRunningTime="2026-01-30 19:07:06.775895757 +0000 UTC m=+2203.044273782" Jan 30 19:07:07 crc kubenswrapper[4782]: I0130 19:07:07.766172 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerStarted","Data":"db66c472529f2a0abae3fcedc7a08fc5f456768af5109ac5f7838cadf874673c"} Jan 30 19:07:07 crc kubenswrapper[4782]: I0130 19:07:07.772819 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pvqc" event={"ID":"be36622d-3651-4140-ab52-4f76eec4eab5","Type":"ContainerStarted","Data":"23d56778b7635d1d3eda125d54762eb226042e91c5956528167ecd404f2cd82d"} Jan 30 19:07:07 crc kubenswrapper[4782]: I0130 19:07:07.819140 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6pvqc" podStartSLOduration=4.020764376 podStartE2EDuration="12.819116809s" podCreationTimestamp="2026-01-30 19:06:55 +0000 UTC" firstStartedPulling="2026-01-30 19:06:57.593608042 +0000 UTC m=+2193.861986087" lastFinishedPulling="2026-01-30 19:07:06.391960475 +0000 UTC m=+2202.660338520" observedRunningTime="2026-01-30 19:07:07.812657229 +0000 UTC m=+2204.081035254" watchObservedRunningTime="2026-01-30 19:07:07.819116809 +0000 UTC m=+2204.087494834" Jan 30 19:07:10 crc kubenswrapper[4782]: I0130 19:07:10.803123 4782 generic.go:334] "Generic (PLEG): container finished" podID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerID="db66c472529f2a0abae3fcedc7a08fc5f456768af5109ac5f7838cadf874673c" exitCode=0 Jan 30 19:07:10 crc kubenswrapper[4782]: I0130 19:07:10.803216 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerDied","Data":"db66c472529f2a0abae3fcedc7a08fc5f456768af5109ac5f7838cadf874673c"} Jan 30 19:07:11 crc kubenswrapper[4782]: I0130 19:07:11.816486 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerStarted","Data":"15b04990df23909a37ccad9e9d663bf6b95c56a785046107b135a0504240c304"} Jan 30 19:07:11 crc kubenswrapper[4782]: I0130 19:07:11.838109 4782 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-t4whs" podStartSLOduration=8.239953775 podStartE2EDuration="13.838090351s" podCreationTimestamp="2026-01-30 19:06:58 +0000 UTC" firstStartedPulling="2026-01-30 19:07:05.696362104 +0000 UTC m=+2201.964740129" lastFinishedPulling="2026-01-30 19:07:11.29449869 +0000 UTC m=+2207.562876705" observedRunningTime="2026-01-30 19:07:11.833659871 +0000 UTC m=+2208.102037896" watchObservedRunningTime="2026-01-30 19:07:11.838090351 +0000 UTC m=+2208.106468376" Jan 30 19:07:15 crc kubenswrapper[4782]: I0130 19:07:15.915824 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:07:15 crc kubenswrapper[4782]: I0130 19:07:15.916285 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:07:16 crc kubenswrapper[4782]: I0130 19:07:16.119350 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:07:16 crc kubenswrapper[4782]: I0130 19:07:16.119398 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:07:16 crc kubenswrapper[4782]: I0130 19:07:16.971856 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6pvqc" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="registry-server" probeResult="failure" output=< Jan 30 19:07:16 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:07:16 crc kubenswrapper[4782]: > Jan 30 19:07:17 crc kubenswrapper[4782]: I0130 19:07:17.173523 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mrfpc" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="registry-server" probeResult="failure" output=< Jan 30 19:07:17 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:07:17 crc kubenswrapper[4782]: > Jan 30 19:07:19 crc kubenswrapper[4782]: I0130 19:07:19.306327 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:07:19 crc kubenswrapper[4782]: I0130 19:07:19.306400 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:07:20 crc kubenswrapper[4782]: I0130 19:07:20.366700 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t4whs" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="registry-server" probeResult="failure" output=< Jan 30 19:07:20 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:07:20 crc kubenswrapper[4782]: > Jan 30 19:07:25 crc kubenswrapper[4782]: I0130 19:07:25.994613 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:07:26 crc kubenswrapper[4782]: I0130 19:07:26.053947 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:07:26 crc kubenswrapper[4782]: I0130 19:07:26.179789 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:07:26 crc kubenswrapper[4782]: I0130 19:07:26.224577 4782 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.656899 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrfpc"] Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.658593 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mrfpc" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="registry-server" containerID="cri-o://d423f6d80668fcd737acf1ab508643cfd44b9bb14f9ca58b623ebc85ce165ec5" gracePeriod=2 Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.843857 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6pvqc"] Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.844535 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6pvqc" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="registry-server" containerID="cri-o://23d56778b7635d1d3eda125d54762eb226042e91c5956528167ecd404f2cd82d" gracePeriod=2 Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.993700 4782 generic.go:334] "Generic (PLEG): container finished" podID="be36622d-3651-4140-ab52-4f76eec4eab5" containerID="23d56778b7635d1d3eda125d54762eb226042e91c5956528167ecd404f2cd82d" exitCode=0 Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.993767 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pvqc" event={"ID":"be36622d-3651-4140-ab52-4f76eec4eab5","Type":"ContainerDied","Data":"23d56778b7635d1d3eda125d54762eb226042e91c5956528167ecd404f2cd82d"} Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.997430 4782 generic.go:334] "Generic (PLEG): container finished" podID="94368d1d-c550-4681-ab18-317352d860be" containerID="d423f6d80668fcd737acf1ab508643cfd44b9bb14f9ca58b623ebc85ce165ec5" exitCode=0 Jan 30 19:07:28 crc kubenswrapper[4782]: I0130 19:07:28.997485 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrfpc" event={"ID":"94368d1d-c550-4681-ab18-317352d860be","Type":"ContainerDied","Data":"d423f6d80668fcd737acf1ab508643cfd44b9bb14f9ca58b623ebc85ce165ec5"} Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.132381 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.219306 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-utilities\") pod \"94368d1d-c550-4681-ab18-317352d860be\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.219362 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-catalog-content\") pod \"94368d1d-c550-4681-ab18-317352d860be\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.219647 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zf8s\" (UniqueName: \"kubernetes.io/projected/94368d1d-c550-4681-ab18-317352d860be-kube-api-access-8zf8s\") pod \"94368d1d-c550-4681-ab18-317352d860be\" (UID: \"94368d1d-c550-4681-ab18-317352d860be\") " Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.220354 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-utilities" (OuterVolumeSpecName: "utilities") pod "94368d1d-c550-4681-ab18-317352d860be" (UID: "94368d1d-c550-4681-ab18-317352d860be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.225779 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94368d1d-c550-4681-ab18-317352d860be-kube-api-access-8zf8s" (OuterVolumeSpecName: "kube-api-access-8zf8s") pod "94368d1d-c550-4681-ab18-317352d860be" (UID: "94368d1d-c550-4681-ab18-317352d860be"). InnerVolumeSpecName "kube-api-access-8zf8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.265063 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.267757 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94368d1d-c550-4681-ab18-317352d860be" (UID: "94368d1d-c550-4681-ab18-317352d860be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.321941 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zf8s\" (UniqueName: \"kubernetes.io/projected/94368d1d-c550-4681-ab18-317352d860be-kube-api-access-8zf8s\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.321978 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.321990 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94368d1d-c550-4681-ab18-317352d860be-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.422998 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-catalog-content\") pod \"be36622d-3651-4140-ab52-4f76eec4eab5\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.423666 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-utilities\") pod \"be36622d-3651-4140-ab52-4f76eec4eab5\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.423811 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6lf\" (UniqueName: \"kubernetes.io/projected/be36622d-3651-4140-ab52-4f76eec4eab5-kube-api-access-bb6lf\") pod \"be36622d-3651-4140-ab52-4f76eec4eab5\" (UID: \"be36622d-3651-4140-ab52-4f76eec4eab5\") " Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.424351 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-utilities" (OuterVolumeSpecName: "utilities") pod "be36622d-3651-4140-ab52-4f76eec4eab5" (UID: "be36622d-3651-4140-ab52-4f76eec4eab5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.424534 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.426639 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be36622d-3651-4140-ab52-4f76eec4eab5-kube-api-access-bb6lf" (OuterVolumeSpecName: "kube-api-access-bb6lf") pod "be36622d-3651-4140-ab52-4f76eec4eab5" (UID: "be36622d-3651-4140-ab52-4f76eec4eab5"). InnerVolumeSpecName "kube-api-access-bb6lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.471996 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be36622d-3651-4140-ab52-4f76eec4eab5" (UID: "be36622d-3651-4140-ab52-4f76eec4eab5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.527741 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6lf\" (UniqueName: \"kubernetes.io/projected/be36622d-3651-4140-ab52-4f76eec4eab5-kube-api-access-bb6lf\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:29 crc kubenswrapper[4782]: I0130 19:07:29.527792 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be36622d-3651-4140-ab52-4f76eec4eab5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.013952 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mrfpc" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.013973 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mrfpc" event={"ID":"94368d1d-c550-4681-ab18-317352d860be","Type":"ContainerDied","Data":"3838e5fea3eb70a0d4a4d0905019cfbd08895b0e545cd3090aba91f26931d17d"} Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.014072 4782 scope.go:117] "RemoveContainer" containerID="d423f6d80668fcd737acf1ab508643cfd44b9bb14f9ca58b623ebc85ce165ec5" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.017897 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6pvqc" event={"ID":"be36622d-3651-4140-ab52-4f76eec4eab5","Type":"ContainerDied","Data":"a3550c9b90ad6ea1a146241480ac507267382ffcc3fe47bcb477f51f1f160c81"} Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.018014 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6pvqc" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.043479 4782 scope.go:117] "RemoveContainer" containerID="0ce90338b05a550c7d92e063cbdf659250aa81a4cd8bf4c6918672b46b0493fd" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.070549 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mrfpc"] Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.079790 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mrfpc"] Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.099539 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6pvqc"] Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.104716 4782 scope.go:117] "RemoveContainer" containerID="928d063080cf665c97a71d45f31676ebf04f4072b4136570914d233e9d51950c" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.106946 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6pvqc"] Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.136584 4782 scope.go:117] "RemoveContainer" containerID="23d56778b7635d1d3eda125d54762eb226042e91c5956528167ecd404f2cd82d" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.175914 4782 scope.go:117] "RemoveContainer" containerID="13181d59359ae07fcca21cbd6be61e3afbaa6b708e7d1179721d5e3fbef66226" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.205387 4782 scope.go:117] "RemoveContainer" containerID="375053c1c41f3751c9a11f07ec8eb2b75b525b74c5b14dd9793982f9f67ac074" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.369018 4782 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-t4whs" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="registry-server" probeResult="failure" output=< Jan 30 19:07:30 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:07:30 crc kubenswrapper[4782]: > Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.427249 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94368d1d-c550-4681-ab18-317352d860be" path="/var/lib/kubelet/pods/94368d1d-c550-4681-ab18-317352d860be/volumes" Jan 30 19:07:30 crc kubenswrapper[4782]: I0130 19:07:30.427909 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" path="/var/lib/kubelet/pods/be36622d-3651-4140-ab52-4f76eec4eab5/volumes" Jan 30 19:07:40 crc kubenswrapper[4782]: I0130 19:07:40.389323 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t4whs" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="registry-server" probeResult="failure" output=< Jan 30 19:07:40 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:07:40 crc kubenswrapper[4782]: > Jan 30 19:07:49 crc kubenswrapper[4782]: I0130 19:07:49.377375 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:07:49 crc kubenswrapper[4782]: I0130 19:07:49.464002 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:07:49 crc kubenswrapper[4782]: I0130 19:07:49.633642 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4whs"] Jan 30 19:07:51 crc kubenswrapper[4782]: I0130 19:07:51.255385 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t4whs" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="registry-server" containerID="cri-o://15b04990df23909a37ccad9e9d663bf6b95c56a785046107b135a0504240c304" gracePeriod=2 Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.294064 4782 generic.go:334] "Generic (PLEG): container finished" podID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerID="15b04990df23909a37ccad9e9d663bf6b95c56a785046107b135a0504240c304" exitCode=0 Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.294483 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerDied","Data":"15b04990df23909a37ccad9e9d663bf6b95c56a785046107b135a0504240c304"} Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.444783 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.574148 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-utilities\") pod \"17460b94-f89d-4b35-a7a6-fb037bd61672\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.574279 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trn7n\" (UniqueName: \"kubernetes.io/projected/17460b94-f89d-4b35-a7a6-fb037bd61672-kube-api-access-trn7n\") pod \"17460b94-f89d-4b35-a7a6-fb037bd61672\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.574330 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-catalog-content\") pod \"17460b94-f89d-4b35-a7a6-fb037bd61672\" (UID: \"17460b94-f89d-4b35-a7a6-fb037bd61672\") " Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.574934 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-utilities" (OuterVolumeSpecName: "utilities") pod "17460b94-f89d-4b35-a7a6-fb037bd61672" (UID: "17460b94-f89d-4b35-a7a6-fb037bd61672"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.591688 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17460b94-f89d-4b35-a7a6-fb037bd61672-kube-api-access-trn7n" (OuterVolumeSpecName: "kube-api-access-trn7n") pod "17460b94-f89d-4b35-a7a6-fb037bd61672" (UID: "17460b94-f89d-4b35-a7a6-fb037bd61672"). InnerVolumeSpecName "kube-api-access-trn7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.676823 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.676854 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trn7n\" (UniqueName: \"kubernetes.io/projected/17460b94-f89d-4b35-a7a6-fb037bd61672-kube-api-access-trn7n\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.699788 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17460b94-f89d-4b35-a7a6-fb037bd61672" (UID: "17460b94-f89d-4b35-a7a6-fb037bd61672"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:07:52 crc kubenswrapper[4782]: I0130 19:07:52.778558 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17460b94-f89d-4b35-a7a6-fb037bd61672-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:07:53 crc kubenswrapper[4782]: I0130 19:07:53.304528 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t4whs" event={"ID":"17460b94-f89d-4b35-a7a6-fb037bd61672","Type":"ContainerDied","Data":"ae5f2bf78b129c4ad179082d81ad3667e855f71979ebf6bf629cc1a8a63cb0fc"} Jan 30 19:07:53 crc kubenswrapper[4782]: I0130 19:07:53.304607 4782 scope.go:117] "RemoveContainer" containerID="15b04990df23909a37ccad9e9d663bf6b95c56a785046107b135a0504240c304" Jan 30 19:07:53 crc kubenswrapper[4782]: I0130 19:07:53.304612 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t4whs" Jan 30 19:07:53 crc kubenswrapper[4782]: I0130 19:07:53.339696 4782 scope.go:117] "RemoveContainer" containerID="db66c472529f2a0abae3fcedc7a08fc5f456768af5109ac5f7838cadf874673c" Jan 30 19:07:53 crc kubenswrapper[4782]: I0130 19:07:53.348515 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t4whs"] Jan 30 19:07:53 crc kubenswrapper[4782]: I0130 19:07:53.356941 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t4whs"] Jan 30 19:07:53 crc kubenswrapper[4782]: I0130 19:07:53.369196 4782 scope.go:117] "RemoveContainer" containerID="951195c8203280bc42847939c56d6b7e17cf018e615a3a598b2c54591ae329ac" Jan 30 19:07:54 crc kubenswrapper[4782]: I0130 19:07:54.421928 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" path="/var/lib/kubelet/pods/17460b94-f89d-4b35-a7a6-fb037bd61672/volumes" Jan 30 19:08:46 crc kubenswrapper[4782]: I0130 19:08:46.924597 4782 generic.go:334] "Generic (PLEG): container finished" podID="31a7790e-b097-45c3-9088-5fc885e63ef8" containerID="3feebe6564305bc99bddc439b2a5f29c859ff097de90f342af46a3aafd83412f" exitCode=0 Jan 30 19:08:46 crc kubenswrapper[4782]: I0130 19:08:46.924707 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" event={"ID":"31a7790e-b097-45c3-9088-5fc885e63ef8","Type":"ContainerDied","Data":"3feebe6564305bc99bddc439b2a5f29c859ff097de90f342af46a3aafd83412f"} Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.371722 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.440185 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-ssh-key-openstack-edpm-ipam\") pod \"31a7790e-b097-45c3-9088-5fc885e63ef8\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.440379 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-inventory\") pod \"31a7790e-b097-45c3-9088-5fc885e63ef8\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.440422 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-secret-0\") pod \"31a7790e-b097-45c3-9088-5fc885e63ef8\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.440547 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-combined-ca-bundle\") pod \"31a7790e-b097-45c3-9088-5fc885e63ef8\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.440685 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj8hm\" (UniqueName: \"kubernetes.io/projected/31a7790e-b097-45c3-9088-5fc885e63ef8-kube-api-access-hj8hm\") pod \"31a7790e-b097-45c3-9088-5fc885e63ef8\" (UID: \"31a7790e-b097-45c3-9088-5fc885e63ef8\") " Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.446347 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "31a7790e-b097-45c3-9088-5fc885e63ef8" (UID: "31a7790e-b097-45c3-9088-5fc885e63ef8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.449607 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a7790e-b097-45c3-9088-5fc885e63ef8-kube-api-access-hj8hm" (OuterVolumeSpecName: "kube-api-access-hj8hm") pod "31a7790e-b097-45c3-9088-5fc885e63ef8" (UID: "31a7790e-b097-45c3-9088-5fc885e63ef8"). InnerVolumeSpecName "kube-api-access-hj8hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.482684 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "31a7790e-b097-45c3-9088-5fc885e63ef8" (UID: "31a7790e-b097-45c3-9088-5fc885e63ef8"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.482747 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31a7790e-b097-45c3-9088-5fc885e63ef8" (UID: "31a7790e-b097-45c3-9088-5fc885e63ef8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.485027 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-inventory" (OuterVolumeSpecName: "inventory") pod "31a7790e-b097-45c3-9088-5fc885e63ef8" (UID: "31a7790e-b097-45c3-9088-5fc885e63ef8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.543901 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj8hm\" (UniqueName: \"kubernetes.io/projected/31a7790e-b097-45c3-9088-5fc885e63ef8-kube-api-access-hj8hm\") on node \"crc\" DevicePath \"\"" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.543944 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.543956 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.543968 4782 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.543979 4782 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a7790e-b097-45c3-9088-5fc885e63ef8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.954726 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" event={"ID":"31a7790e-b097-45c3-9088-5fc885e63ef8","Type":"ContainerDied","Data":"70527b6db60439ed79fdd995935a25894c0a6cc3cd7b99ba5ea3286e570a511d"} Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.954774 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70527b6db60439ed79fdd995935a25894c0a6cc3cd7b99ba5ea3286e570a511d" Jan 30 19:08:48 crc kubenswrapper[4782]: I0130 19:08:48.954902 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8cccm" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.059747 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8"] Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060198 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="extract-utilities" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060243 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="extract-utilities" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060259 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060267 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060286 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060295 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060312 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="extract-utilities" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060319 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="extract-utilities" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060341 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060349 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060359 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a7790e-b097-45c3-9088-5fc885e63ef8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060368 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a7790e-b097-45c3-9088-5fc885e63ef8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060387 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="extract-content" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060394 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="extract-content" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060405 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="extract-content" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060413 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="extract-content" Jan 30 19:08:49 crc kubenswrapper[4782]: 
E0130 19:08:49.060425 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="extract-content" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060431 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="extract-content" Jan 30 19:08:49 crc kubenswrapper[4782]: E0130 19:08:49.060449 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="extract-utilities" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060458 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="extract-utilities" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060690 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="be36622d-3651-4140-ab52-4f76eec4eab5" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060709 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="94368d1d-c550-4681-ab18-317352d860be" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060720 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a7790e-b097-45c3-9088-5fc885e63ef8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.060735 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="17460b94-f89d-4b35-a7a6-fb037bd61672" containerName="registry-server" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.061712 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.064792 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.065197 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.065633 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.066011 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.066293 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.066681 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.068253 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.079868 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8"] Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.160368 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.160674 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.160871 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.161078 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.161191 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.161355 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s5pq\" (UniqueName: \"kubernetes.io/projected/6e19180a-524d-4e70-8e9a-e72c69f07d7c-kube-api-access-5s5pq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.161481 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.161675 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.161827 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.263755 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264188 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264329 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264425 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s5pq\" (UniqueName: \"kubernetes.io/projected/6e19180a-524d-4e70-8e9a-e72c69f07d7c-kube-api-access-5s5pq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264529 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264663 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264748 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264848 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.264926 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.266076 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.269043 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.270410 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.271999 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.279310 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.282824 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.283776 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.286548 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.290857 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s5pq\" (UniqueName: \"kubernetes.io/projected/6e19180a-524d-4e70-8e9a-e72c69f07d7c-kube-api-access-5s5pq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-f8ss8\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:49 crc kubenswrapper[4782]: I0130 19:08:49.427493 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:08:50 crc kubenswrapper[4782]: I0130 19:08:50.042153 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8"] Jan 30 19:08:50 crc kubenswrapper[4782]: W0130 19:08:50.049334 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e19180a_524d_4e70_8e9a_e72c69f07d7c.slice/crio-2256c4b7eb5c3f9a45d245b50b1fc6c10e7a2f1c4fb6f693a636d4a8edce6ba9 WatchSource:0}: Error finding container 2256c4b7eb5c3f9a45d245b50b1fc6c10e7a2f1c4fb6f693a636d4a8edce6ba9: Status 404 returned error can't find the container with id 2256c4b7eb5c3f9a45d245b50b1fc6c10e7a2f1c4fb6f693a636d4a8edce6ba9 Jan 30 19:08:50 crc kubenswrapper[4782]: I0130 19:08:50.979690 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" event={"ID":"6e19180a-524d-4e70-8e9a-e72c69f07d7c","Type":"ContainerStarted","Data":"d90a326c64eb9261805faf7914ffe55af12e4d5503a1dc552f5bb2de35ecfb61"} Jan 30 19:08:50 crc kubenswrapper[4782]: I0130 19:08:50.979999 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" event={"ID":"6e19180a-524d-4e70-8e9a-e72c69f07d7c","Type":"ContainerStarted","Data":"2256c4b7eb5c3f9a45d245b50b1fc6c10e7a2f1c4fb6f693a636d4a8edce6ba9"} Jan 30 19:08:51 crc kubenswrapper[4782]: I0130 19:08:51.012024 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" podStartSLOduration=1.463484729 podStartE2EDuration="2.011990473s" podCreationTimestamp="2026-01-30 19:08:49 +0000 UTC" firstStartedPulling="2026-01-30 19:08:50.051362459 +0000 UTC m=+2306.319740484" lastFinishedPulling="2026-01-30 19:08:50.599868203 +0000 UTC m=+2306.868246228" observedRunningTime="2026-01-30 19:08:51.007148083 +0000 UTC m=+2307.275526148" watchObservedRunningTime="2026-01-30 19:08:51.011990473 +0000 UTC m=+2307.280368598" Jan 30 19:09:19 crc kubenswrapper[4782]: I0130 19:09:19.792245 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:09:19 crc kubenswrapper[4782]: I0130 19:09:19.792863 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:09:49 crc kubenswrapper[4782]: I0130 19:09:49.792701 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:09:49 crc kubenswrapper[4782]: I0130 19:09:49.793437 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.793540 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.794478 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.794550 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.795253 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.795325 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" gracePeriod=600 Jan 30 19:10:19 crc kubenswrapper[4782]: E0130 19:10:19.917604 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.984611 4782 
generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" exitCode=0 Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.984659 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052"} Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.984738 4782 scope.go:117] "RemoveContainer" containerID="a8248bdd536385dad9e4a7ba9de482d0518f9893a440220a80b84827335dcbf3" Jan 30 19:10:19 crc kubenswrapper[4782]: I0130 19:10:19.985363 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:10:19 crc kubenswrapper[4782]: E0130 19:10:19.985599 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:10:34 crc kubenswrapper[4782]: I0130 19:10:34.423973 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:10:34 crc kubenswrapper[4782]: E0130 19:10:34.425170 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:10:46 crc kubenswrapper[4782]: I0130 19:10:46.411817 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:10:46 crc kubenswrapper[4782]: E0130 19:10:46.412799 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:10:58 crc kubenswrapper[4782]: I0130 19:10:58.411098 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:10:58 crc kubenswrapper[4782]: E0130 19:10:58.411903 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:11:05 crc kubenswrapper[4782]: I0130 19:11:05.468485 4782 generic.go:334] "Generic (PLEG): container finished" 
podID="6e19180a-524d-4e70-8e9a-e72c69f07d7c" containerID="d90a326c64eb9261805faf7914ffe55af12e4d5503a1dc552f5bb2de35ecfb61" exitCode=0 Jan 30 19:11:05 crc kubenswrapper[4782]: I0130 19:11:05.468646 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" event={"ID":"6e19180a-524d-4e70-8e9a-e72c69f07d7c","Type":"ContainerDied","Data":"d90a326c64eb9261805faf7914ffe55af12e4d5503a1dc552f5bb2de35ecfb61"} Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.902334 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908322 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-extra-config-0\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908422 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-combined-ca-bundle\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908460 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s5pq\" (UniqueName: \"kubernetes.io/projected/6e19180a-524d-4e70-8e9a-e72c69f07d7c-kube-api-access-5s5pq\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908523 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-1\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908633 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-ssh-key-openstack-edpm-ipam\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908667 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-0\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908706 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-1\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908827 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-0\") 
pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.908944 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-inventory\") pod \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\" (UID: \"6e19180a-524d-4e70-8e9a-e72c69f07d7c\") " Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.915476 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.927764 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e19180a-524d-4e70-8e9a-e72c69f07d7c-kube-api-access-5s5pq" (OuterVolumeSpecName: "kube-api-access-5s5pq") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "kube-api-access-5s5pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.965459 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.968193 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.969387 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.971479 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.971946 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-inventory" (OuterVolumeSpecName: "inventory") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.975732 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:11:06 crc kubenswrapper[4782]: I0130 19:11:06.992894 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6e19180a-524d-4e70-8e9a-e72c69f07d7c" (UID: "6e19180a-524d-4e70-8e9a-e72c69f07d7c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012202 4782 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012486 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012564 4782 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012628 4782 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012692 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s5pq\" (UniqueName: \"kubernetes.io/projected/6e19180a-524d-4e70-8e9a-e72c69f07d7c-kube-api-access-5s5pq\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012750 4782 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012809 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012915 4782 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.012981 4782 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6e19180a-524d-4e70-8e9a-e72c69f07d7c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.492945 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" event={"ID":"6e19180a-524d-4e70-8e9a-e72c69f07d7c","Type":"ContainerDied","Data":"2256c4b7eb5c3f9a45d245b50b1fc6c10e7a2f1c4fb6f693a636d4a8edce6ba9"} Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.493009 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2256c4b7eb5c3f9a45d245b50b1fc6c10e7a2f1c4fb6f693a636d4a8edce6ba9" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.492972 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-f8ss8" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.633870 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt"] Jan 30 19:11:07 crc kubenswrapper[4782]: E0130 19:11:07.634457 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e19180a-524d-4e70-8e9a-e72c69f07d7c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.634485 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e19180a-524d-4e70-8e9a-e72c69f07d7c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.634744 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e19180a-524d-4e70-8e9a-e72c69f07d7c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.635604 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.637510 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.639372 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.639379 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.640182 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8zx4c" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.640202 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.657250 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt"] Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.730128 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.730193 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.730284 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.730486 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.730568 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 
19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.730678 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wlcz\" (UniqueName: \"kubernetes.io/projected/0055025f-d7c7-4469-9791-ffcb0bbdfef4-kube-api-access-8wlcz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.730776 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.832338 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wlcz\" (UniqueName: \"kubernetes.io/projected/0055025f-d7c7-4469-9791-ffcb0bbdfef4-kube-api-access-8wlcz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.832407 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.832517 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.832543 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.833337 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.833388 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.833410 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.846063 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.846578 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.851004 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.851455 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.853790 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.853972 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.854649 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wlcz\" (UniqueName: \"kubernetes.io/projected/0055025f-d7c7-4469-9791-ffcb0bbdfef4-kube-api-access-8wlcz\") 
pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:07 crc kubenswrapper[4782]: I0130 19:11:07.955819 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:11:08 crc kubenswrapper[4782]: I0130 19:11:08.561544 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt"] Jan 30 19:11:09 crc kubenswrapper[4782]: I0130 19:11:09.515541 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" event={"ID":"0055025f-d7c7-4469-9791-ffcb0bbdfef4","Type":"ContainerStarted","Data":"64f7ef0149d7121439c1017015c415a94bd616a26fd50f9917894afaa3da6be4"} Jan 30 19:11:09 crc kubenswrapper[4782]: I0130 19:11:09.516111 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" event={"ID":"0055025f-d7c7-4469-9791-ffcb0bbdfef4","Type":"ContainerStarted","Data":"d9cb12ebe78b54381dce923346ca431ef938e90c52f2802e17620dfdd713f7bb"} Jan 30 19:11:09 crc kubenswrapper[4782]: I0130 19:11:09.557736 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" podStartSLOduration=2.032265441 podStartE2EDuration="2.557674683s" podCreationTimestamp="2026-01-30 19:11:07 +0000 UTC" firstStartedPulling="2026-01-30 19:11:08.555184022 +0000 UTC m=+2444.823562047" lastFinishedPulling="2026-01-30 19:11:09.080593244 +0000 UTC m=+2445.348971289" observedRunningTime="2026-01-30 19:11:09.541153803 +0000 UTC m=+2445.809531878" watchObservedRunningTime="2026-01-30 19:11:09.557674683 +0000 UTC m=+2445.826052738" Jan 30 19:11:10 crc kubenswrapper[4782]: I0130 19:11:10.411463 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:11:10 crc kubenswrapper[4782]: E0130 19:11:10.411728 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:11:24 crc kubenswrapper[4782]: I0130 19:11:24.420458 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:11:24 crc kubenswrapper[4782]: E0130 19:11:24.423125 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:11:39 crc kubenswrapper[4782]: I0130 19:11:39.426813 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:11:39 crc kubenswrapper[4782]: E0130 19:11:39.427953 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:11:53 crc kubenswrapper[4782]: I0130 19:11:53.410944 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:11:53 crc kubenswrapper[4782]: E0130 19:11:53.411799 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:12:08 crc kubenswrapper[4782]: I0130 19:12:08.411690 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:12:08 crc kubenswrapper[4782]: E0130 19:12:08.412912 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:12:21 crc kubenswrapper[4782]: I0130 19:12:21.411533 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:12:21 crc kubenswrapper[4782]: E0130 19:12:21.412449 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:12:34 crc kubenswrapper[4782]: I0130 19:12:34.423866 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:12:34 crc kubenswrapper[4782]: E0130 19:12:34.426162 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:12:48 crc kubenswrapper[4782]: I0130 19:12:48.411427 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:12:48 crc kubenswrapper[4782]: E0130 19:12:48.412413 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.411262 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:13:00 crc kubenswrapper[4782]: E0130 19:13:00.412048 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.644523 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-28qf6"] Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.647037 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.659217 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28qf6"] Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.795437 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsxc8\" (UniqueName: \"kubernetes.io/projected/feb29bf7-65dc-4512-baaa-2fca27c58bff-kube-api-access-hsxc8\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.795542 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-catalog-content\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.795592 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-utilities\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.897654 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsxc8\" (UniqueName: \"kubernetes.io/projected/feb29bf7-65dc-4512-baaa-2fca27c58bff-kube-api-access-hsxc8\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.897742 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-catalog-content\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.897777 4782 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-utilities\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.898344 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-utilities\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.898528 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-catalog-content\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.917171 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsxc8\" (UniqueName: \"kubernetes.io/projected/feb29bf7-65dc-4512-baaa-2fca27c58bff-kube-api-access-hsxc8\") pod \"redhat-marketplace-28qf6\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:00 crc kubenswrapper[4782]: I0130 19:13:00.978357 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:01 crc kubenswrapper[4782]: I0130 19:13:01.552779 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28qf6"] Jan 30 19:13:02 crc kubenswrapper[4782]: I0130 19:13:02.289029 4782 generic.go:334] "Generic (PLEG): container finished" podID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerID="7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc" exitCode=0 Jan 30 19:13:02 crc kubenswrapper[4782]: I0130 19:13:02.289456 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28qf6" event={"ID":"feb29bf7-65dc-4512-baaa-2fca27c58bff","Type":"ContainerDied","Data":"7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc"} Jan 30 19:13:02 crc kubenswrapper[4782]: I0130 19:13:02.289594 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28qf6" event={"ID":"feb29bf7-65dc-4512-baaa-2fca27c58bff","Type":"ContainerStarted","Data":"608624058d4c3b5d88125117d1cf51f60f0c7cdce1707bf81559cc506aee5671"} Jan 30 19:13:02 crc kubenswrapper[4782]: I0130 19:13:02.291789 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:13:03 crc kubenswrapper[4782]: I0130 19:13:03.301511 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28qf6" event={"ID":"feb29bf7-65dc-4512-baaa-2fca27c58bff","Type":"ContainerStarted","Data":"627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9"} Jan 30 19:13:04 crc kubenswrapper[4782]: I0130 19:13:04.338238 4782 generic.go:334] "Generic (PLEG): container finished" podID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerID="627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9" exitCode=0 Jan 30 19:13:04 crc kubenswrapper[4782]: I0130 19:13:04.338291 4782 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28qf6" event={"ID":"feb29bf7-65dc-4512-baaa-2fca27c58bff","Type":"ContainerDied","Data":"627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9"} Jan 30 19:13:05 crc kubenswrapper[4782]: I0130 19:13:05.351015 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28qf6" event={"ID":"feb29bf7-65dc-4512-baaa-2fca27c58bff","Type":"ContainerStarted","Data":"98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255"} Jan 30 19:13:05 crc kubenswrapper[4782]: I0130 19:13:05.377336 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-28qf6" podStartSLOduration=2.825836456 podStartE2EDuration="5.377316523s" podCreationTimestamp="2026-01-30 19:13:00 +0000 UTC" firstStartedPulling="2026-01-30 19:13:02.291554095 +0000 UTC m=+2558.559932120" lastFinishedPulling="2026-01-30 19:13:04.843034152 +0000 UTC m=+2561.111412187" observedRunningTime="2026-01-30 19:13:05.368469444 +0000 UTC m=+2561.636847479" watchObservedRunningTime="2026-01-30 19:13:05.377316523 +0000 UTC m=+2561.645694548" Jan 30 19:13:07 crc kubenswrapper[4782]: I0130 19:13:07.373913 4782 generic.go:334] "Generic (PLEG): container finished" podID="0055025f-d7c7-4469-9791-ffcb0bbdfef4" containerID="64f7ef0149d7121439c1017015c415a94bd616a26fd50f9917894afaa3da6be4" exitCode=0 Jan 30 19:13:07 crc kubenswrapper[4782]: I0130 19:13:07.373964 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" event={"ID":"0055025f-d7c7-4469-9791-ffcb0bbdfef4","Type":"ContainerDied","Data":"64f7ef0149d7121439c1017015c415a94bd616a26fd50f9917894afaa3da6be4"} Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.846757 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.981323 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-telemetry-combined-ca-bundle\") pod \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.982842 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-inventory\") pod \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.982887 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ssh-key-openstack-edpm-ipam\") pod \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.982971 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wlcz\" (UniqueName: \"kubernetes.io/projected/0055025f-d7c7-4469-9791-ffcb0bbdfef4-kube-api-access-8wlcz\") pod \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.983014 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-2\") pod \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.983099 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-0\") pod \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.983353 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-1\") pod \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\" (UID: \"0055025f-d7c7-4469-9791-ffcb0bbdfef4\") " Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.993575 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0055025f-d7c7-4469-9791-ffcb0bbdfef4-kube-api-access-8wlcz" (OuterVolumeSpecName: "kube-api-access-8wlcz") pod "0055025f-d7c7-4469-9791-ffcb0bbdfef4" (UID: "0055025f-d7c7-4469-9791-ffcb0bbdfef4"). InnerVolumeSpecName "kube-api-access-8wlcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:13:08 crc kubenswrapper[4782]: I0130 19:13:08.993575 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0055025f-d7c7-4469-9791-ffcb0bbdfef4" (UID: "0055025f-d7c7-4469-9791-ffcb0bbdfef4"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.014025 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-inventory" (OuterVolumeSpecName: "inventory") pod "0055025f-d7c7-4469-9791-ffcb0bbdfef4" (UID: "0055025f-d7c7-4469-9791-ffcb0bbdfef4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.015609 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0055025f-d7c7-4469-9791-ffcb0bbdfef4" (UID: "0055025f-d7c7-4469-9791-ffcb0bbdfef4"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.018089 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0055025f-d7c7-4469-9791-ffcb0bbdfef4" (UID: "0055025f-d7c7-4469-9791-ffcb0bbdfef4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.019722 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0055025f-d7c7-4469-9791-ffcb0bbdfef4" (UID: "0055025f-d7c7-4469-9791-ffcb0bbdfef4"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.021948 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0055025f-d7c7-4469-9791-ffcb0bbdfef4" (UID: "0055025f-d7c7-4469-9791-ffcb0bbdfef4"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.085780 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wlcz\" (UniqueName: \"kubernetes.io/projected/0055025f-d7c7-4469-9791-ffcb0bbdfef4-kube-api-access-8wlcz\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.085819 4782 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.085833 4782 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.085846 4782 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.085855 4782 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.085865 4782 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.085874 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0055025f-d7c7-4469-9791-ffcb0bbdfef4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.408826 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" event={"ID":"0055025f-d7c7-4469-9791-ffcb0bbdfef4","Type":"ContainerDied","Data":"d9cb12ebe78b54381dce923346ca431ef938e90c52f2802e17620dfdd713f7bb"} Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.408873 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9cb12ebe78b54381dce923346ca431ef938e90c52f2802e17620dfdd713f7bb" Jan 30 19:13:09 crc kubenswrapper[4782]: I0130 19:13:09.408923 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt" Jan 30 19:13:10 crc kubenswrapper[4782]: I0130 19:13:10.979395 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:10 crc kubenswrapper[4782]: I0130 19:13:10.979789 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:11 crc kubenswrapper[4782]: I0130 19:13:11.031014 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:11 crc kubenswrapper[4782]: I0130 19:13:11.511087 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:11 crc kubenswrapper[4782]: I0130 19:13:11.582432 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28qf6"] Jan 30 19:13:13 crc kubenswrapper[4782]: I0130 19:13:13.450627 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-28qf6" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="registry-server" containerID="cri-o://98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255" gracePeriod=2 Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.121136 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.201412 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-utilities\") pod \"feb29bf7-65dc-4512-baaa-2fca27c58bff\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.201517 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-catalog-content\") pod \"feb29bf7-65dc-4512-baaa-2fca27c58bff\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.201639 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsxc8\" (UniqueName: \"kubernetes.io/projected/feb29bf7-65dc-4512-baaa-2fca27c58bff-kube-api-access-hsxc8\") pod \"feb29bf7-65dc-4512-baaa-2fca27c58bff\" (UID: \"feb29bf7-65dc-4512-baaa-2fca27c58bff\") " Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.208886 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb29bf7-65dc-4512-baaa-2fca27c58bff-kube-api-access-hsxc8" (OuterVolumeSpecName: "kube-api-access-hsxc8") pod "feb29bf7-65dc-4512-baaa-2fca27c58bff" (UID: "feb29bf7-65dc-4512-baaa-2fca27c58bff"). InnerVolumeSpecName "kube-api-access-hsxc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.209879 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-utilities" (OuterVolumeSpecName: "utilities") pod "feb29bf7-65dc-4512-baaa-2fca27c58bff" (UID: "feb29bf7-65dc-4512-baaa-2fca27c58bff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.243742 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "feb29bf7-65dc-4512-baaa-2fca27c58bff" (UID: "feb29bf7-65dc-4512-baaa-2fca27c58bff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.303978 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsxc8\" (UniqueName: \"kubernetes.io/projected/feb29bf7-65dc-4512-baaa-2fca27c58bff-kube-api-access-hsxc8\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.304014 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.304024 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feb29bf7-65dc-4512-baaa-2fca27c58bff-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.419741 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:13:14 crc kubenswrapper[4782]: E0130 19:13:14.420161 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.460582 4782 generic.go:334] "Generic (PLEG): container finished" podID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerID="98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255" exitCode=0 Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.460635 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28qf6" event={"ID":"feb29bf7-65dc-4512-baaa-2fca27c58bff","Type":"ContainerDied","Data":"98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255"} Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.460640 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28qf6" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.460671 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28qf6" event={"ID":"feb29bf7-65dc-4512-baaa-2fca27c58bff","Type":"ContainerDied","Data":"608624058d4c3b5d88125117d1cf51f60f0c7cdce1707bf81559cc506aee5671"} Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.460749 4782 scope.go:117] "RemoveContainer" containerID="98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.491315 4782 scope.go:117] "RemoveContainer" containerID="627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.496486 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-28qf6"] Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.516494 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-28qf6"] Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.516780 4782 scope.go:117] "RemoveContainer" containerID="7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.579846 4782 scope.go:117] "RemoveContainer" containerID="98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255" Jan 30 19:13:14 crc kubenswrapper[4782]: E0130 19:13:14.580265 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255\": container with ID starting with 98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255 not found: ID does not exist" containerID="98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.580316 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255"} err="failed to get container status \"98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255\": rpc error: code = NotFound desc = could not find container \"98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255\": container with ID starting with 98d868e5bb0b621be9af52aa882fab3528d43b5b8165eded09607c7f04260255 not found: ID does not exist" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.580348 4782 scope.go:117] "RemoveContainer" containerID="627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9" Jan 30 19:13:14 crc kubenswrapper[4782]: E0130 19:13:14.580797 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9\": container with ID starting with 627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9 not found: ID does not exist" containerID="627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.580829 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9"} err="failed to get container status \"627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9\": rpc error: code = NotFound desc = could not find 
container \"627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9\": container with ID starting with 627db7de943b7c2c0564a670bd2207b02b13204cad670ccad8eb59977a3ccaf9 not found: ID does not exist" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.580852 4782 scope.go:117] "RemoveContainer" containerID="7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc" Jan 30 19:13:14 crc kubenswrapper[4782]: E0130 19:13:14.581091 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc\": container with ID starting with 7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc not found: ID does not exist" containerID="7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc" Jan 30 19:13:14 crc kubenswrapper[4782]: I0130 19:13:14.581115 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc"} err="failed to get container status \"7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc\": rpc error: code = NotFound desc = could not find container \"7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc\": container with ID starting with 7f8aae697cd9ac660feb434015885350ce94fa4b41a6d635a92e7570dd84acfc not found: ID does not exist" Jan 30 19:13:16 crc kubenswrapper[4782]: I0130 19:13:16.420531 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" path="/var/lib/kubelet/pods/feb29bf7-65dc-4512-baaa-2fca27c58bff/volumes" Jan 30 19:13:27 crc kubenswrapper[4782]: I0130 19:13:27.411680 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:13:27 crc kubenswrapper[4782]: E0130 19:13:27.412330 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:13:40 crc kubenswrapper[4782]: I0130 19:13:40.413443 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:13:40 crc kubenswrapper[4782]: E0130 19:13:40.417553 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.654485 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Jan 30 19:13:42 crc kubenswrapper[4782]: E0130 19:13:42.656055 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0055025f-d7c7-4469-9791-ffcb0bbdfef4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.656158 4782 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0055025f-d7c7-4469-9791-ffcb0bbdfef4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 19:13:42 crc kubenswrapper[4782]: E0130 19:13:42.656271 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="extract-content" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.656371 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="extract-content" Jan 30 19:13:42 crc kubenswrapper[4782]: E0130 19:13:42.656479 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="extract-utilities" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.656566 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="extract-utilities" Jan 30 19:13:42 crc kubenswrapper[4782]: E0130 19:13:42.656662 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="registry-server" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.656741 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="registry-server" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.657057 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb29bf7-65dc-4512-baaa-2fca27c58bff" containerName="registry-server" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.657159 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0055025f-d7c7-4469-9791-ffcb0bbdfef4" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.658585 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.661562 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.673611 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.750286 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.751982 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.753826 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/2773f02f-26c1-4c26-a789-afc299bd11c1-kube-api-access-l4mrd\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.753871 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.753911 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.753952 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.753987 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-run\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754033 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754063 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-sys\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754099 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-lib-modules\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754122 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-config-data\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754140 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754170 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-scripts\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754200 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754220 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-dev\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754258 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.754316 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.756549 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.761316 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.785098 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.786859 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.790535 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.802456 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.855971 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856027 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/2773f02f-26c1-4c26-a789-afc299bd11c1-kube-api-access-l4mrd\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856062 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856088 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-sys\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856110 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856130 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856144 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856168 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856203 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856310 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856366 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856404 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856486 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856525 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856590 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-run\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856640 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856653 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-run\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856676 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwmj\" (UniqueName: \"kubernetes.io/projected/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-kube-api-access-htwmj\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " 
pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856794 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856834 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856867 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-sys\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856921 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-sys\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856937 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-lib-modules\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856964 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.856989 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857020 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857055 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-lib-modules\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857075 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-config-data\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857096 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857120 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-run\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857143 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-scripts\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857166 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9htq\" (UniqueName: \"kubernetes.io/projected/f0fe280a-4eaa-4dc5-8898-053826fd7131-kube-api-access-k9htq\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857187 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857205 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857222 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-dev\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857251 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857268 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: 
I0130 19:13:42.857287 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-dev\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857280 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857315 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857333 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857353 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857353 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-dev\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857204 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857458 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857465 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857626 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: 
\"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857676 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857700 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857743 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2773f02f-26c1-4c26-a789-afc299bd11c1-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857778 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857815 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857830 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857899 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.857919 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.862753 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-scripts\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.862878 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.862984 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.871885 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mrd\" (UniqueName: \"kubernetes.io/projected/2773f02f-26c1-4c26-a789-afc299bd11c1-kube-api-access-l4mrd\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.873555 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2773f02f-26c1-4c26-a789-afc299bd11c1-config-data\") pod \"cinder-backup-0\" (UID: \"2773f02f-26c1-4c26-a789-afc299bd11c1\") " pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959386 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959442 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-sys\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959471 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959490 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959518 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959518 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-sys\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: 
I0130 19:13:42.959534 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959561 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959564 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959581 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959596 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959592 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959623 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959575 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959681 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959693 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" 
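
The reconciler_common / operation_generator entries above trace kubelet's volume reconciliation for the three cinder pods: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded for each hostPath, secret, and projected volume. A minimal triage sketch for auditing that sequence, assuming the excerpt is fed on stdin; this is not kubelet code, and the file name and regexp are illustrative, keyed to the escaped-quote form seen in this capture:

```go
// mounttriage.go (hypothetical helper): count, per pod, the volumes that
// logged "MountVolume.SetUp succeeded" in a kubelet journal excerpt.
package main

import (
	"fmt"
	"io"
	"os"
	"regexp"
)

func main() {
	data, err := io.ReadAll(os.Stdin)
	if err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}

	// Inner quotes are backslash-escaped in these log lines, e.g.
	//   MountVolume.SetUp succeeded for volume \"etc-iscsi\" ... pod="openstack/cinder-backup-0"
	re := regexp.MustCompile(`(?s)MountVolume\.SetUp succeeded for volume \\"([^\\"]+)\\".*?pod="([^"]+)"`)

	perPod := map[string][]string{} // pod -> volumes that completed SetUp
	for _, m := range re.FindAllStringSubmatch(string(data), -1) {
		perPod[m[2]] = append(perPod[m[2]], m[1])
	}

	for pod, vols := range perPod {
		fmt.Printf("%-35s %2d volumes mounted: %v\n", pod, len(vols), vols)
	}
}
```

Usage, assuming the excerpt is saved locally: go run mounttriage.go < kubelet.log. Once the later SetUp lines below are included, it should report 15 mounted volumes for each of cinder-backup-0, cinder-volume-nfs-0, and cinder-volume-nfs-2-0, matching the volume sets declared in the VerifyControllerAttachedVolume entries.
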
Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959749 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959776 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwmj\" (UniqueName: \"kubernetes.io/projected/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-kube-api-access-htwmj\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959862 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959878 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959956 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.959978 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960000 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960038 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-run\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960088 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9htq\" (UniqueName: \"kubernetes.io/projected/f0fe280a-4eaa-4dc5-8898-053826fd7131-kube-api-access-k9htq\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960110 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960142 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960163 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-run\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960169 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-dev\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960190 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-dev\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960212 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960216 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960243 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960279 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960330 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960381 4782 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960433 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960466 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960467 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960480 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960507 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960526 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960546 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960552 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960685 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " 
pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960707 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960722 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960747 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.960772 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f0fe280a-4eaa-4dc5-8898-053826fd7131-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.961040 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.963136 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.963944 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.964390 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.965423 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.965947 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.966244 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.966683 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.974127 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fe280a-4eaa-4dc5-8898-053826fd7131-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.984204 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.989176 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9htq\" (UniqueName: \"kubernetes.io/projected/f0fe280a-4eaa-4dc5-8898-053826fd7131-kube-api-access-k9htq\") pod \"cinder-volume-nfs-2-0\" (UID: \"f0fe280a-4eaa-4dc5-8898-053826fd7131\") " pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:42 crc kubenswrapper[4782]: I0130 19:13:42.990641 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwmj\" (UniqueName: \"kubernetes.io/projected/eacda6b1-72d6-4a27-9aa5-c0b01309e9d9-kube-api-access-htwmj\") pod \"cinder-volume-nfs-0\" (UID: \"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9\") " pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:43 crc kubenswrapper[4782]: I0130 19:13:43.079441 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:43 crc kubenswrapper[4782]: I0130 19:13:43.113322 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:43 crc kubenswrapper[4782]: I0130 19:13:43.607168 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 30 19:13:43 crc kubenswrapper[4782]: I0130 19:13:43.731818 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Jan 30 19:13:43 crc kubenswrapper[4782]: I0130 19:13:43.804517 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2773f02f-26c1-4c26-a789-afc299bd11c1","Type":"ContainerStarted","Data":"9f3b1f5e314530ac0e090b6b01a9c513db35340fec4340dfbcbd9b57217475db"} Jan 30 19:13:43 crc kubenswrapper[4782]: I0130 19:13:43.806360 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9","Type":"ContainerStarted","Data":"a8bac01339fd9d6738e02a6711a8c49da58b4ea16303b292cedff8b65a878c2c"} Jan 30 19:13:43 crc kubenswrapper[4782]: I0130 19:13:43.827599 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Jan 30 19:13:43 crc kubenswrapper[4782]: W0130 19:13:43.830028 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0fe280a_4eaa_4dc5_8898_053826fd7131.slice/crio-4d7b9a16595a6242d39dc3201840e6bc7ef92dd7baa7826d60fbcf9db5a0745a WatchSource:0}: Error finding container 4d7b9a16595a6242d39dc3201840e6bc7ef92dd7baa7826d60fbcf9db5a0745a: Status 404 returned error can't find the container with id 4d7b9a16595a6242d39dc3201840e6bc7ef92dd7baa7826d60fbcf9db5a0745a Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.828676 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"f0fe280a-4eaa-4dc5-8898-053826fd7131","Type":"ContainerStarted","Data":"8292c49b4d3d1e72344a88c052b5ff0235903012a2eea972965948d6af1bbadb"} Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.829054 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"f0fe280a-4eaa-4dc5-8898-053826fd7131","Type":"ContainerStarted","Data":"c34b1a7db00656afdc4a194b6937073e79145edcd1744a069578411fcc3d7714"} Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.829064 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"f0fe280a-4eaa-4dc5-8898-053826fd7131","Type":"ContainerStarted","Data":"4d7b9a16595a6242d39dc3201840e6bc7ef92dd7baa7826d60fbcf9db5a0745a"} Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.834100 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2773f02f-26c1-4c26-a789-afc299bd11c1","Type":"ContainerStarted","Data":"b93f462e349c99a57827095db9c2eb58bf87a8d8513f8df2879fea80c2ec10f2"} Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.834132 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2773f02f-26c1-4c26-a789-afc299bd11c1","Type":"ContainerStarted","Data":"4e8ca84170d7c81baeb01b357bd8bb06e52a0ce38850782395165e63067a7519"} Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.838034 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9","Type":"ContainerStarted","Data":"73931405bd791728c0d85fa60da461446772ad2750b60203ad9077508c65a23a"} Jan 30 19:13:44 crc kubenswrapper[4782]: 
I0130 19:13:44.838062 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"eacda6b1-72d6-4a27-9aa5-c0b01309e9d9","Type":"ContainerStarted","Data":"c1a02519fac3c11a64a5a12ca876296b5246a25a7cb365582469a75ff0bb83fe"} Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.857211 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.653639491 podStartE2EDuration="2.857194247s" podCreationTimestamp="2026-01-30 19:13:42 +0000 UTC" firstStartedPulling="2026-01-30 19:13:43.834620667 +0000 UTC m=+2600.102998692" lastFinishedPulling="2026-01-30 19:13:44.038175423 +0000 UTC m=+2600.306553448" observedRunningTime="2026-01-30 19:13:44.849838205 +0000 UTC m=+2601.118216250" watchObservedRunningTime="2026-01-30 19:13:44.857194247 +0000 UTC m=+2601.125572272" Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.893575 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.671017124 podStartE2EDuration="2.893551191s" podCreationTimestamp="2026-01-30 19:13:42 +0000 UTC" firstStartedPulling="2026-01-30 19:13:43.606505661 +0000 UTC m=+2599.874883686" lastFinishedPulling="2026-01-30 19:13:43.829039728 +0000 UTC m=+2600.097417753" observedRunningTime="2026-01-30 19:13:44.887763217 +0000 UTC m=+2601.156141242" watchObservedRunningTime="2026-01-30 19:13:44.893551191 +0000 UTC m=+2601.161929216" Jan 30 19:13:44 crc kubenswrapper[4782]: I0130 19:13:44.922948 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=2.678504099 podStartE2EDuration="2.92292772s" podCreationTimestamp="2026-01-30 19:13:42 +0000 UTC" firstStartedPulling="2026-01-30 19:13:43.783178759 +0000 UTC m=+2600.051556804" lastFinishedPulling="2026-01-30 19:13:44.0276024 +0000 UTC m=+2600.295980425" observedRunningTime="2026-01-30 19:13:44.911374283 +0000 UTC m=+2601.179752318" watchObservedRunningTime="2026-01-30 19:13:44.92292772 +0000 UTC m=+2601.191305745" Jan 30 19:13:47 crc kubenswrapper[4782]: I0130 19:13:47.985171 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 30 19:13:48 crc kubenswrapper[4782]: I0130 19:13:48.079694 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Jan 30 19:13:48 crc kubenswrapper[4782]: I0130 19:13:48.114251 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:52 crc kubenswrapper[4782]: I0130 19:13:52.411274 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:13:52 crc kubenswrapper[4782]: E0130 19:13:52.411989 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:13:53 crc kubenswrapper[4782]: I0130 19:13:53.137485 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 30 19:13:53 crc kubenswrapper[4782]: I0130 19:13:53.313249 4782 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Jan 30 19:13:53 crc kubenswrapper[4782]: I0130 19:13:53.372699 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Jan 30 19:14:06 crc kubenswrapper[4782]: I0130 19:14:06.411672 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:14:06 crc kubenswrapper[4782]: E0130 19:14:06.412678 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:14:18 crc kubenswrapper[4782]: I0130 19:14:18.411411 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:14:18 crc kubenswrapper[4782]: E0130 19:14:18.412271 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:14:30 crc kubenswrapper[4782]: I0130 19:14:30.411366 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:14:30 crc kubenswrapper[4782]: E0130 19:14:30.412495 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:14:42 crc kubenswrapper[4782]: I0130 19:14:42.410971 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:14:42 crc kubenswrapper[4782]: E0130 19:14:42.412044 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:14:42 crc kubenswrapper[4782]: I0130 19:14:42.440130 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 19:14:42 crc kubenswrapper[4782]: I0130 19:14:42.440892 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="prometheus" containerID="cri-o://f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51" gracePeriod=600 Jan 30 19:14:42 crc kubenswrapper[4782]: I0130 19:14:42.440971 
4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="thanos-sidecar" containerID="cri-o://b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9" gracePeriod=600 Jan 30 19:14:42 crc kubenswrapper[4782]: I0130 19:14:42.440956 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="config-reloader" containerID="cri-o://819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f" gracePeriod=600 Jan 30 19:14:42 crc kubenswrapper[4782]: I0130 19:14:42.605952 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.145:9090/-/ready\": dial tcp 10.217.0.145:9090: connect: connection refused" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.476217 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504385 4782 generic.go:334] "Generic (PLEG): container finished" podID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerID="b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9" exitCode=0 Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504422 4782 generic.go:334] "Generic (PLEG): container finished" podID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerID="819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f" exitCode=0 Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504433 4782 generic.go:334] "Generic (PLEG): container finished" podID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerID="f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51" exitCode=0 Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504436 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504455 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerDied","Data":"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9"} Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504479 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerDied","Data":"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f"} Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504513 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerDied","Data":"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51"} Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504532 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ddcc605c-425e-4a03-9fc6-7979c3cc341e","Type":"ContainerDied","Data":"97bfae0737ec94e8ed8db19019eb512491cf7b03968d31c7b7717785d3817977"} Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.504506 4782 scope.go:117] "RemoveContainer" containerID="b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.528712 4782 scope.go:117] "RemoveContainer" containerID="819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.573264 4782 scope.go:117] "RemoveContainer" containerID="f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.598632 4782 scope.go:117] "RemoveContainer" containerID="b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.622994 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-secret-combined-ca-bundle\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623041 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-thanos-prometheus-http-client-file\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623065 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-2\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623097 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-1\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: 
\"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623156 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config-out\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623176 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623200 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-tls-assets\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623394 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623444 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-0\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623520 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623585 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpd96\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-kube-api-access-lpd96\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623615 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.623670 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config\") pod \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\" (UID: \"ddcc605c-425e-4a03-9fc6-7979c3cc341e\") " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.626423 4782 scope.go:117] 
"RemoveContainer" containerID="b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.626826 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.626913 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.626982 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9\": container with ID starting with b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9 not found: ID does not exist" containerID="b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.627034 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9"} err="failed to get container status \"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9\": rpc error: code = NotFound desc = could not find container \"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9\": container with ID starting with b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.627076 4782 scope.go:117] "RemoveContainer" containerID="819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.627115 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.629341 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f\": container with ID starting with 819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f not found: ID does not exist" containerID="819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.629390 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f"} err="failed to get container status \"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f\": rpc error: code = NotFound desc = could not find container \"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f\": container with ID starting with 819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.629419 4782 scope.go:117] "RemoveContainer" containerID="f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.629806 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config" (OuterVolumeSpecName: "config") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.629993 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51\": container with ID starting with f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51 not found: ID does not exist" containerID="f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.630026 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51"} err="failed to get container status \"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51\": rpc error: code = NotFound desc = could not find container \"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51\": container with ID starting with f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.630041 4782 scope.go:117] "RemoveContainer" containerID="b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555" Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.630440 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555\": container with ID starting with b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555 not found: ID does not exist" containerID="b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.630471 4782 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555"} err="failed to get container status \"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555\": rpc error: code = NotFound desc = could not find container \"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555\": container with ID starting with b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.630491 4782 scope.go:117] "RemoveContainer" containerID="b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.630703 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.630969 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9"} err="failed to get container status \"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9\": rpc error: code = NotFound desc = could not find container \"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9\": container with ID starting with b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.630997 4782 scope.go:117] "RemoveContainer" containerID="819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.631210 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config-out" (OuterVolumeSpecName: "config-out") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.631284 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.631304 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f"} err="failed to get container status \"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f\": rpc error: code = NotFound desc = could not find container \"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f\": container with ID starting with 819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.631325 4782 scope.go:117] "RemoveContainer" containerID="f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.631531 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51"} err="failed to get container status \"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51\": rpc error: code = NotFound desc = could not find container \"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51\": container with ID starting with f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.631559 4782 scope.go:117] "RemoveContainer" containerID="b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.632546 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.632575 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.633498 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555"} err="failed to get container status \"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555\": rpc error: code = NotFound desc = could not find container \"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555\": container with ID starting with b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.633526 4782 scope.go:117] "RemoveContainer" containerID="b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.634139 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9"} err="failed to get container status \"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9\": rpc error: code = NotFound desc = could not find container \"b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9\": container with ID starting with b67fa12163bc46bf9d51da2a132fbc59fff2466151eb3340b5792600ff5d41d9 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.634167 4782 scope.go:117] "RemoveContainer" containerID="819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.634514 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f"} err="failed to get container status \"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f\": rpc error: code = NotFound desc = could not find container \"819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f\": container with ID starting with 819ecd3ee8c948a1dee65bd3ed29d1e18b316469ae849e258c61b53168155e6f not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.634541 4782 scope.go:117] "RemoveContainer" containerID="f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.634754 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51"} err="failed to get container status \"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51\": rpc error: code = NotFound desc = could not find container \"f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51\": container with ID starting with f8a2bbe0740ccd72a500215e68e9b2b90c26f388d1f849843b27ac1b84b75d51 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.634783 4782 scope.go:117] "RemoveContainer" containerID="b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.635401 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555"} err="failed to get container status \"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555\": rpc error: code = NotFound desc = could not find container 
\"b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555\": container with ID starting with b00089a8854d6e29e014b792f3a085e8c03984b9f71760803af5ad88d199c555 not found: ID does not exist" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.637345 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.640406 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-kube-api-access-lpd96" (OuterVolumeSpecName: "kube-api-access-lpd96") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "kube-api-access-lpd96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.658249 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "pvc-10319900-5721-45e2-9485-947fcfd18ab3". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727283 4782 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727317 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpd96\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-kube-api-access-lpd96\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727329 4782 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727339 4782 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727348 4782 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727362 4782 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727373 4782 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727382 4782 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config-out\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727741 4782 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-config\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727758 4782 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ddcc605c-425e-4a03-9fc6-7979c3cc341e-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727796 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") on node \"crc\" " Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.727809 4782 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ddcc605c-425e-4a03-9fc6-7979c3cc341e-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.740024 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config" (OuterVolumeSpecName: "web-config") pod "ddcc605c-425e-4a03-9fc6-7979c3cc341e" (UID: "ddcc605c-425e-4a03-9fc6-7979c3cc341e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.766026 4782 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.766800 4782 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-10319900-5721-45e2-9485-947fcfd18ab3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3") on node "crc" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.830688 4782 reconciler_common.go:293] "Volume detached for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.830727 4782 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ddcc605c-425e-4a03-9fc6-7979c3cc341e-web-config\") on node \"crc\" DevicePath \"\"" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.841363 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.850369 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867077 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.867518 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="prometheus" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867534 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="prometheus" Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.867543 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="init-config-reloader" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867550 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="init-config-reloader" Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.867572 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="thanos-sidecar" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867579 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="thanos-sidecar" Jan 30 19:14:43 crc kubenswrapper[4782]: E0130 19:14:43.867591 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="config-reloader" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867597 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="config-reloader" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867770 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="config-reloader" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867791 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="thanos-sidecar" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.867800 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" containerName="prometheus" Jan 30 19:14:43 
crc kubenswrapper[4782]: I0130 19:14:43.869535 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.872132 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.872382 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4h9n8" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.872520 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.873450 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.875539 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.875640 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.880301 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.886813 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 19:14:43 crc kubenswrapper[4782]: I0130 19:14:43.891278 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034278 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034490 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034684 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-config\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034754 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " 
pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034807 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034852 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034922 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/620ae2cd-1705-4975-92e7-32c6b559c37d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034949 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp7t6\" (UniqueName: \"kubernetes.io/projected/620ae2cd-1705-4975-92e7-32c6b559c37d-kube-api-access-zp7t6\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.034984 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.035007 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.035032 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.035135 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/620ae2cd-1705-4975-92e7-32c6b559c37d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.035204 
4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.136991 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137142 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137213 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-config\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137272 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137304 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137331 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137368 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/620ae2cd-1705-4975-92e7-32c6b559c37d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137391 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp7t6\" (UniqueName: 
\"kubernetes.io/projected/620ae2cd-1705-4975-92e7-32c6b559c37d-kube-api-access-zp7t6\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137438 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137472 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137504 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137536 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/620ae2cd-1705-4975-92e7-32c6b559c37d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.137563 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.138337 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.139118 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.140942 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.142724 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.142795 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-config\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.143567 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.143679 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/620ae2cd-1705-4975-92e7-32c6b559c37d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.146446 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/620ae2cd-1705-4975-92e7-32c6b559c37d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.147085 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/620ae2cd-1705-4975-92e7-32c6b559c37d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.147837 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.148548 4782 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.148673 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7160aa072a4d7b723e1d7d729bf0a8cb68e388f5eb2d179074e77609eba87da8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.155827 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620ae2cd-1705-4975-92e7-32c6b559c37d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.174475 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp7t6\" (UniqueName: \"kubernetes.io/projected/620ae2cd-1705-4975-92e7-32c6b559c37d-kube-api-access-zp7t6\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.223860 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10319900-5721-45e2-9485-947fcfd18ab3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10319900-5721-45e2-9485-947fcfd18ab3\") pod \"prometheus-metric-storage-0\" (UID: \"620ae2cd-1705-4975-92e7-32c6b559c37d\") " pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.430836 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcc605c-425e-4a03-9fc6-7979c3cc341e" path="/var/lib/kubelet/pods/ddcc605c-425e-4a03-9fc6-7979c3cc341e/volumes" Jan 30 19:14:44 crc kubenswrapper[4782]: I0130 19:14:44.485993 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 30 19:14:45 crc kubenswrapper[4782]: I0130 19:14:45.069817 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 30 19:14:45 crc kubenswrapper[4782]: W0130 19:14:45.081118 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod620ae2cd_1705_4975_92e7_32c6b559c37d.slice/crio-1e232944b64b7a08db58e45702ca2b8d239d067ebe35d40005b07b0ac9eeb04c WatchSource:0}: Error finding container 1e232944b64b7a08db58e45702ca2b8d239d067ebe35d40005b07b0ac9eeb04c: Status 404 returned error can't find the container with id 1e232944b64b7a08db58e45702ca2b8d239d067ebe35d40005b07b0ac9eeb04c Jan 30 19:14:45 crc kubenswrapper[4782]: I0130 19:14:45.536649 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"620ae2cd-1705-4975-92e7-32c6b559c37d","Type":"ContainerStarted","Data":"1e232944b64b7a08db58e45702ca2b8d239d067ebe35d40005b07b0ac9eeb04c"} Jan 30 19:14:49 crc kubenswrapper[4782]: I0130 19:14:49.586273 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"620ae2cd-1705-4975-92e7-32c6b559c37d","Type":"ContainerStarted","Data":"8164dced3302e4d83f2dcc08967c12ca039a1debb89e2d9adf9414a2f14d7c18"} Jan 30 19:14:56 crc kubenswrapper[4782]: I0130 19:14:56.657688 4782 generic.go:334] "Generic (PLEG): container finished" podID="620ae2cd-1705-4975-92e7-32c6b559c37d" containerID="8164dced3302e4d83f2dcc08967c12ca039a1debb89e2d9adf9414a2f14d7c18" exitCode=0 Jan 30 19:14:56 crc kubenswrapper[4782]: I0130 19:14:56.657790 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"620ae2cd-1705-4975-92e7-32c6b559c37d","Type":"ContainerDied","Data":"8164dced3302e4d83f2dcc08967c12ca039a1debb89e2d9adf9414a2f14d7c18"} Jan 30 19:14:57 crc kubenswrapper[4782]: I0130 19:14:57.411375 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:14:57 crc kubenswrapper[4782]: E0130 19:14:57.412400 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:14:57 crc kubenswrapper[4782]: I0130 19:14:57.671975 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"620ae2cd-1705-4975-92e7-32c6b559c37d","Type":"ContainerStarted","Data":"ac0a83857438a022a63b558b547f1744749d7d6b58a1f019a1c3bf900f17323a"} Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.161827 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf"] Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.163578 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.167697 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.167697 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.175915 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf"] Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.326677 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bfc7a2-8336-4871-b013-9a7e1212ea8a-secret-volume\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.326736 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfhd\" (UniqueName: \"kubernetes.io/projected/23bfc7a2-8336-4871-b013-9a7e1212ea8a-kube-api-access-9dfhd\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.327309 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bfc7a2-8336-4871-b013-9a7e1212ea8a-config-volume\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.430676 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bfc7a2-8336-4871-b013-9a7e1212ea8a-secret-volume\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.431433 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfhd\" (UniqueName: \"kubernetes.io/projected/23bfc7a2-8336-4871-b013-9a7e1212ea8a-kube-api-access-9dfhd\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.431998 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bfc7a2-8336-4871-b013-9a7e1212ea8a-config-volume\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.432873 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bfc7a2-8336-4871-b013-9a7e1212ea8a-config-volume\") pod 
\"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.449160 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bfc7a2-8336-4871-b013-9a7e1212ea8a-secret-volume\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.454308 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfhd\" (UniqueName: \"kubernetes.io/projected/23bfc7a2-8336-4871-b013-9a7e1212ea8a-kube-api-access-9dfhd\") pod \"collect-profiles-29496675-f8nsf\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:00 crc kubenswrapper[4782]: I0130 19:15:00.499690 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:01 crc kubenswrapper[4782]: W0130 19:15:01.086113 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23bfc7a2_8336_4871_b013_9a7e1212ea8a.slice/crio-d4c63b30c525348071a573ea980aa927c47735c3c653ffd49da6f03c8f859cf8 WatchSource:0}: Error finding container d4c63b30c525348071a573ea980aa927c47735c3c653ffd49da6f03c8f859cf8: Status 404 returned error can't find the container with id d4c63b30c525348071a573ea980aa927c47735c3c653ffd49da6f03c8f859cf8 Jan 30 19:15:01 crc kubenswrapper[4782]: I0130 19:15:01.091659 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf"] Jan 30 19:15:01 crc kubenswrapper[4782]: I0130 19:15:01.778064 4782 generic.go:334] "Generic (PLEG): container finished" podID="23bfc7a2-8336-4871-b013-9a7e1212ea8a" containerID="84df9e8415c13230b6fab6bbf8dad240f12410e21f10cc104e0b880bb4f5223c" exitCode=0 Jan 30 19:15:01 crc kubenswrapper[4782]: I0130 19:15:01.778214 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" event={"ID":"23bfc7a2-8336-4871-b013-9a7e1212ea8a","Type":"ContainerDied","Data":"84df9e8415c13230b6fab6bbf8dad240f12410e21f10cc104e0b880bb4f5223c"} Jan 30 19:15:01 crc kubenswrapper[4782]: I0130 19:15:01.778508 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" event={"ID":"23bfc7a2-8336-4871-b013-9a7e1212ea8a","Type":"ContainerStarted","Data":"d4c63b30c525348071a573ea980aa927c47735c3c653ffd49da6f03c8f859cf8"} Jan 30 19:15:01 crc kubenswrapper[4782]: I0130 19:15:01.781977 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"620ae2cd-1705-4975-92e7-32c6b559c37d","Type":"ContainerStarted","Data":"12dab64ff0c45dbecf8aafdd323ee4cc9e65a5619aa1746d83e75a37234b0464"} Jan 30 19:15:01 crc kubenswrapper[4782]: I0130 19:15:01.782035 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"620ae2cd-1705-4975-92e7-32c6b559c37d","Type":"ContainerStarted","Data":"43f37e44685cf03fb76defbd0f68b21f72fe115341d08fd07c5ccea3a927f432"} Jan 30 19:15:03 crc 
kubenswrapper[4782]: I0130 19:15:03.220980 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.244322 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.244295982 podStartE2EDuration="20.244295982s" podCreationTimestamp="2026-01-30 19:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 19:15:01.836453461 +0000 UTC m=+2678.104831506" watchObservedRunningTime="2026-01-30 19:15:03.244295982 +0000 UTC m=+2679.512674017" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.301193 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bfc7a2-8336-4871-b013-9a7e1212ea8a-config-volume\") pod \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.301310 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bfc7a2-8336-4871-b013-9a7e1212ea8a-secret-volume\") pod \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.301413 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfhd\" (UniqueName: \"kubernetes.io/projected/23bfc7a2-8336-4871-b013-9a7e1212ea8a-kube-api-access-9dfhd\") pod \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\" (UID: \"23bfc7a2-8336-4871-b013-9a7e1212ea8a\") " Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.302162 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23bfc7a2-8336-4871-b013-9a7e1212ea8a-config-volume" (OuterVolumeSpecName: "config-volume") pod "23bfc7a2-8336-4871-b013-9a7e1212ea8a" (UID: "23bfc7a2-8336-4871-b013-9a7e1212ea8a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.302482 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23bfc7a2-8336-4871-b013-9a7e1212ea8a-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.308292 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23bfc7a2-8336-4871-b013-9a7e1212ea8a-kube-api-access-9dfhd" (OuterVolumeSpecName: "kube-api-access-9dfhd") pod "23bfc7a2-8336-4871-b013-9a7e1212ea8a" (UID: "23bfc7a2-8336-4871-b013-9a7e1212ea8a"). InnerVolumeSpecName "kube-api-access-9dfhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.308707 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23bfc7a2-8336-4871-b013-9a7e1212ea8a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23bfc7a2-8336-4871-b013-9a7e1212ea8a" (UID: "23bfc7a2-8336-4871-b013-9a7e1212ea8a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.404267 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23bfc7a2-8336-4871-b013-9a7e1212ea8a-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.404305 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfhd\" (UniqueName: \"kubernetes.io/projected/23bfc7a2-8336-4871-b013-9a7e1212ea8a-kube-api-access-9dfhd\") on node \"crc\" DevicePath \"\"" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.807859 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" event={"ID":"23bfc7a2-8336-4871-b013-9a7e1212ea8a","Type":"ContainerDied","Data":"d4c63b30c525348071a573ea980aa927c47735c3c653ffd49da6f03c8f859cf8"} Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.807918 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf" Jan 30 19:15:03 crc kubenswrapper[4782]: I0130 19:15:03.807931 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4c63b30c525348071a573ea980aa927c47735c3c653ffd49da6f03c8f859cf8" Jan 30 19:15:04 crc kubenswrapper[4782]: I0130 19:15:04.317501 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr"] Jan 30 19:15:04 crc kubenswrapper[4782]: I0130 19:15:04.329860 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496630-tq7nr"] Jan 30 19:15:04 crc kubenswrapper[4782]: I0130 19:15:04.422559 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699e1980-2c0b-4f99-8977-3ce25d99f142" path="/var/lib/kubelet/pods/699e1980-2c0b-4f99-8977-3ce25d99f142/volumes" Jan 30 19:15:04 crc kubenswrapper[4782]: I0130 19:15:04.486302 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 30 19:15:11 crc kubenswrapper[4782]: I0130 19:15:11.410965 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:15:11 crc kubenswrapper[4782]: E0130 19:15:11.412556 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:15:14 crc kubenswrapper[4782]: I0130 19:15:14.486640 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 30 19:15:14 crc kubenswrapper[4782]: I0130 19:15:14.497131 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 30 19:15:14 crc kubenswrapper[4782]: I0130 19:15:14.927304 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.276844 4782 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/tempest-tests-tempest"] Jan 30 19:15:22 crc kubenswrapper[4782]: E0130 19:15:22.277935 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bfc7a2-8336-4871-b013-9a7e1212ea8a" containerName="collect-profiles" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.277950 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bfc7a2-8336-4871-b013-9a7e1212ea8a" containerName="collect-profiles" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.278153 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="23bfc7a2-8336-4871-b013-9a7e1212ea8a" containerName="collect-profiles" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.278937 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.281337 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nw6qc" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.281570 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.282350 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.282508 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.285763 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.342746 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.342797 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.342990 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-config-data\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.412127 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.459185 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.459689 4782 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.459811 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-config-data\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.459867 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.460071 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.460116 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.460148 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.460184 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.460220 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndfzw\" (UniqueName: \"kubernetes.io/projected/bd740cf2-1846-4d1e-902e-6ba7a54c0019-kube-api-access-ndfzw\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.461602 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-config-data\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.461805 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.465664 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.562124 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.562275 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.562320 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.562370 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndfzw\" (UniqueName: \"kubernetes.io/projected/bd740cf2-1846-4d1e-902e-6ba7a54c0019-kube-api-access-ndfzw\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.562408 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.562451 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.562635 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.563033 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.563308 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.566384 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.570964 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.599841 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndfzw\" (UniqueName: \"kubernetes.io/projected/bd740cf2-1846-4d1e-902e-6ba7a54c0019-kube-api-access-ndfzw\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.602837 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " pod="openstack/tempest-tests-tempest" Jan 30 19:15:22 crc kubenswrapper[4782]: I0130 19:15:22.649362 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 19:15:23 crc kubenswrapper[4782]: I0130 19:15:23.028049 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"8ec9579f4fc94dcfb95b593bbf0ffe5f79a43b59ecbe4dc18dae76870cb80343"} Jan 30 19:15:23 crc kubenswrapper[4782]: I0130 19:15:23.111417 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 19:15:24 crc kubenswrapper[4782]: I0130 19:15:24.038831 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd740cf2-1846-4d1e-902e-6ba7a54c0019","Type":"ContainerStarted","Data":"5f15ce70cff01e25391c19264f218249c42956282f458b6bd43d8f861a39d5ed"} Jan 30 19:15:25 crc kubenswrapper[4782]: I0130 19:15:25.755071 4782 scope.go:117] "RemoveContainer" containerID="e013d0bfd822f7553ec9d48d35b58477e4484680b2711ed884d6e4ddf3ff94c2" Jan 30 19:15:35 crc kubenswrapper[4782]: I0130 19:15:35.368094 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 19:15:37 crc kubenswrapper[4782]: I0130 19:15:37.176472 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd740cf2-1846-4d1e-902e-6ba7a54c0019","Type":"ContainerStarted","Data":"060e6f75c5e0eaf06952daf495fdb6db426d19d4830db6d85957ec0611706f8d"} Jan 30 19:15:37 crc kubenswrapper[4782]: I0130 19:15:37.204791 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.957473641 podStartE2EDuration="16.204767198s" podCreationTimestamp="2026-01-30 19:15:21 +0000 UTC" firstStartedPulling="2026-01-30 19:15:23.116487224 +0000 UTC m=+2699.384865249" lastFinishedPulling="2026-01-30 19:15:35.363780781 +0000 UTC m=+2711.632158806" observedRunningTime="2026-01-30 19:15:37.204745277 +0000 UTC m=+2713.473123322" watchObservedRunningTime="2026-01-30 19:15:37.204767198 +0000 UTC m=+2713.473145243" Jan 30 19:17:49 crc kubenswrapper[4782]: I0130 19:17:49.792493 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:17:49 crc kubenswrapper[4782]: I0130 19:17:49.793195 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.420025 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pc6hl"] Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.424386 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.435556 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pc6hl"] Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.519998 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-utilities\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.520698 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-catalog-content\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.520826 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mbd\" (UniqueName: \"kubernetes.io/projected/1820c759-018d-4283-9f5b-e06fdda4a367-kube-api-access-s7mbd\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.622765 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-catalog-content\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.622847 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mbd\" (UniqueName: \"kubernetes.io/projected/1820c759-018d-4283-9f5b-e06fdda4a367-kube-api-access-s7mbd\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.622939 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-utilities\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.623385 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-catalog-content\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.623499 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-utilities\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.649167 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s7mbd\" (UniqueName: \"kubernetes.io/projected/1820c759-018d-4283-9f5b-e06fdda4a367-kube-api-access-s7mbd\") pod \"certified-operators-pc6hl\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:09 crc kubenswrapper[4782]: I0130 19:18:09.748296 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:10 crc kubenswrapper[4782]: I0130 19:18:10.484217 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pc6hl"] Jan 30 19:18:10 crc kubenswrapper[4782]: I0130 19:18:10.883304 4782 generic.go:334] "Generic (PLEG): container finished" podID="1820c759-018d-4283-9f5b-e06fdda4a367" containerID="ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2" exitCode=0 Jan 30 19:18:10 crc kubenswrapper[4782]: I0130 19:18:10.883350 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc6hl" event={"ID":"1820c759-018d-4283-9f5b-e06fdda4a367","Type":"ContainerDied","Data":"ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2"} Jan 30 19:18:10 crc kubenswrapper[4782]: I0130 19:18:10.883624 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc6hl" event={"ID":"1820c759-018d-4283-9f5b-e06fdda4a367","Type":"ContainerStarted","Data":"77591b05a17013a95dc107a306dee05fad0e1e3a40ac36de0146c47f1d35ed79"} Jan 30 19:18:10 crc kubenswrapper[4782]: I0130 19:18:10.885138 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:18:11 crc kubenswrapper[4782]: I0130 19:18:11.897314 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc6hl" event={"ID":"1820c759-018d-4283-9f5b-e06fdda4a367","Type":"ContainerStarted","Data":"e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007"} Jan 30 19:18:14 crc kubenswrapper[4782]: I0130 19:18:14.931436 4782 generic.go:334] "Generic (PLEG): container finished" podID="1820c759-018d-4283-9f5b-e06fdda4a367" containerID="e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007" exitCode=0 Jan 30 19:18:14 crc kubenswrapper[4782]: I0130 19:18:14.931639 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc6hl" event={"ID":"1820c759-018d-4283-9f5b-e06fdda4a367","Type":"ContainerDied","Data":"e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007"} Jan 30 19:18:15 crc kubenswrapper[4782]: I0130 19:18:15.972619 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc6hl" event={"ID":"1820c759-018d-4283-9f5b-e06fdda4a367","Type":"ContainerStarted","Data":"f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a"} Jan 30 19:18:16 crc kubenswrapper[4782]: I0130 19:18:16.005959 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pc6hl" podStartSLOduration=2.539934749 podStartE2EDuration="7.005934691s" podCreationTimestamp="2026-01-30 19:18:09 +0000 UTC" firstStartedPulling="2026-01-30 19:18:10.884898434 +0000 UTC m=+2867.153276459" lastFinishedPulling="2026-01-30 19:18:15.350898336 +0000 UTC m=+2871.619276401" observedRunningTime="2026-01-30 19:18:15.995021809 +0000 UTC m=+2872.263399864" watchObservedRunningTime="2026-01-30 
19:18:16.005934691 +0000 UTC m=+2872.274312746" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.748882 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.750178 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.791005 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fjc4s"] Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.792910 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.792988 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.799768 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.806171 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjc4s"] Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.841097 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.895270 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-utilities\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.895377 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-catalog-content\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.895448 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99lq\" (UniqueName: \"kubernetes.io/projected/de288e0e-2d9e-483c-9012-211efe15573c-kube-api-access-l99lq\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.997030 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-utilities\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 
19:18:19.997088 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-catalog-content\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.997120 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99lq\" (UniqueName: \"kubernetes.io/projected/de288e0e-2d9e-483c-9012-211efe15573c-kube-api-access-l99lq\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.997830 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-utilities\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:19 crc kubenswrapper[4782]: I0130 19:18:19.998045 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-catalog-content\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:20 crc kubenswrapper[4782]: I0130 19:18:20.020927 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99lq\" (UniqueName: \"kubernetes.io/projected/de288e0e-2d9e-483c-9012-211efe15573c-kube-api-access-l99lq\") pod \"redhat-operators-fjc4s\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:20 crc kubenswrapper[4782]: I0130 19:18:20.069848 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:20 crc kubenswrapper[4782]: I0130 19:18:20.139487 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:20 crc kubenswrapper[4782]: I0130 19:18:20.603876 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fjc4s"] Jan 30 19:18:21 crc kubenswrapper[4782]: I0130 19:18:21.025067 4782 generic.go:334] "Generic (PLEG): container finished" podID="de288e0e-2d9e-483c-9012-211efe15573c" containerID="ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0" exitCode=0 Jan 30 19:18:21 crc kubenswrapper[4782]: I0130 19:18:21.026906 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjc4s" event={"ID":"de288e0e-2d9e-483c-9012-211efe15573c","Type":"ContainerDied","Data":"ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0"} Jan 30 19:18:21 crc kubenswrapper[4782]: I0130 19:18:21.026938 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjc4s" event={"ID":"de288e0e-2d9e-483c-9012-211efe15573c","Type":"ContainerStarted","Data":"89c226aec75a1ab508777b85b6528fa5591f105116a2a4a99be42100cd75a576"} Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.038354 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjc4s" event={"ID":"de288e0e-2d9e-483c-9012-211efe15573c","Type":"ContainerStarted","Data":"34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22"} Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.138039 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pc6hl"] Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.138300 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pc6hl" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="registry-server" containerID="cri-o://f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a" gracePeriod=2 Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.742095 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.860646 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-catalog-content\") pod \"1820c759-018d-4283-9f5b-e06fdda4a367\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.860744 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-utilities\") pod \"1820c759-018d-4283-9f5b-e06fdda4a367\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.860788 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7mbd\" (UniqueName: \"kubernetes.io/projected/1820c759-018d-4283-9f5b-e06fdda4a367-kube-api-access-s7mbd\") pod \"1820c759-018d-4283-9f5b-e06fdda4a367\" (UID: \"1820c759-018d-4283-9f5b-e06fdda4a367\") " Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.861872 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-utilities" (OuterVolumeSpecName: "utilities") pod "1820c759-018d-4283-9f5b-e06fdda4a367" (UID: "1820c759-018d-4283-9f5b-e06fdda4a367"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.872870 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1820c759-018d-4283-9f5b-e06fdda4a367-kube-api-access-s7mbd" (OuterVolumeSpecName: "kube-api-access-s7mbd") pod "1820c759-018d-4283-9f5b-e06fdda4a367" (UID: "1820c759-018d-4283-9f5b-e06fdda4a367"). InnerVolumeSpecName "kube-api-access-s7mbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.912582 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1820c759-018d-4283-9f5b-e06fdda4a367" (UID: "1820c759-018d-4283-9f5b-e06fdda4a367"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.963969 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.964001 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1820c759-018d-4283-9f5b-e06fdda4a367-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:18:22 crc kubenswrapper[4782]: I0130 19:18:22.964014 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7mbd\" (UniqueName: \"kubernetes.io/projected/1820c759-018d-4283-9f5b-e06fdda4a367-kube-api-access-s7mbd\") on node \"crc\" DevicePath \"\"" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.053584 4782 generic.go:334] "Generic (PLEG): container finished" podID="1820c759-018d-4283-9f5b-e06fdda4a367" containerID="f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a" exitCode=0 Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.053673 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc6hl" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.053689 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc6hl" event={"ID":"1820c759-018d-4283-9f5b-e06fdda4a367","Type":"ContainerDied","Data":"f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a"} Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.053761 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc6hl" event={"ID":"1820c759-018d-4283-9f5b-e06fdda4a367","Type":"ContainerDied","Data":"77591b05a17013a95dc107a306dee05fad0e1e3a40ac36de0146c47f1d35ed79"} Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.053786 4782 scope.go:117] "RemoveContainer" containerID="f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.093007 4782 scope.go:117] "RemoveContainer" containerID="e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.103322 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pc6hl"] Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.111804 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pc6hl"] Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.118374 4782 scope.go:117] "RemoveContainer" containerID="ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.172245 4782 scope.go:117] "RemoveContainer" containerID="f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a" Jan 30 19:18:23 crc kubenswrapper[4782]: E0130 19:18:23.172747 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a\": container with ID starting with f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a not found: ID does not exist" containerID="f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.172791 
4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a"} err="failed to get container status \"f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a\": rpc error: code = NotFound desc = could not find container \"f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a\": container with ID starting with f193154ff7590f745edd9f5a4952d725fad5e7cb5d1d5dbe34899b9a1cd7a72a not found: ID does not exist" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.172825 4782 scope.go:117] "RemoveContainer" containerID="e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007" Jan 30 19:18:23 crc kubenswrapper[4782]: E0130 19:18:23.173172 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007\": container with ID starting with e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007 not found: ID does not exist" containerID="e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.173206 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007"} err="failed to get container status \"e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007\": rpc error: code = NotFound desc = could not find container \"e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007\": container with ID starting with e47b7c86f56e9da57b63d1fef60ae043c0e66728faf0c8a702c668108c220007 not found: ID does not exist" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.173224 4782 scope.go:117] "RemoveContainer" containerID="ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2" Jan 30 19:18:23 crc kubenswrapper[4782]: E0130 19:18:23.174047 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2\": container with ID starting with ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2 not found: ID does not exist" containerID="ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2" Jan 30 19:18:23 crc kubenswrapper[4782]: I0130 19:18:23.174089 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2"} err="failed to get container status \"ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2\": rpc error: code = NotFound desc = could not find container \"ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2\": container with ID starting with ff7a23046343ac4495881ef623232e82bf823ea85e57767813d9a1c9f43754d2 not found: ID does not exist" Jan 30 19:18:24 crc kubenswrapper[4782]: I0130 19:18:24.424699 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" path="/var/lib/kubelet/pods/1820c759-018d-4283-9f5b-e06fdda4a367/volumes" Jan 30 19:18:30 crc kubenswrapper[4782]: E0130 19:18:30.384080 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde288e0e_2d9e_483c_9012_211efe15573c.slice/crio-34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22.scope\": RecentStats: unable to find data in memory cache]" Jan 30 19:18:31 crc kubenswrapper[4782]: I0130 19:18:31.157709 4782 generic.go:334] "Generic (PLEG): container finished" podID="de288e0e-2d9e-483c-9012-211efe15573c" containerID="34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22" exitCode=0 Jan 30 19:18:31 crc kubenswrapper[4782]: I0130 19:18:31.157809 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjc4s" event={"ID":"de288e0e-2d9e-483c-9012-211efe15573c","Type":"ContainerDied","Data":"34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22"} Jan 30 19:18:32 crc kubenswrapper[4782]: I0130 19:18:32.175961 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjc4s" event={"ID":"de288e0e-2d9e-483c-9012-211efe15573c","Type":"ContainerStarted","Data":"88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471"} Jan 30 19:18:32 crc kubenswrapper[4782]: I0130 19:18:32.209912 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fjc4s" podStartSLOduration=2.665032495 podStartE2EDuration="13.209893083s" podCreationTimestamp="2026-01-30 19:18:19 +0000 UTC" firstStartedPulling="2026-01-30 19:18:21.02769742 +0000 UTC m=+2877.296075455" lastFinishedPulling="2026-01-30 19:18:31.572558028 +0000 UTC m=+2887.840936043" observedRunningTime="2026-01-30 19:18:32.204782026 +0000 UTC m=+2888.473160081" watchObservedRunningTime="2026-01-30 19:18:32.209893083 +0000 UTC m=+2888.478271118" Jan 30 19:18:40 crc kubenswrapper[4782]: I0130 19:18:40.140308 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:40 crc kubenswrapper[4782]: I0130 19:18:40.140901 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:41 crc kubenswrapper[4782]: I0130 19:18:41.189534 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fjc4s" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="registry-server" probeResult="failure" output=< Jan 30 19:18:41 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:18:41 crc kubenswrapper[4782]: > Jan 30 19:18:49 crc kubenswrapper[4782]: I0130 19:18:49.792879 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:18:49 crc kubenswrapper[4782]: I0130 19:18:49.793670 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:18:49 crc kubenswrapper[4782]: I0130 19:18:49.793743 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:18:49 crc kubenswrapper[4782]: 
I0130 19:18:49.794998 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ec9579f4fc94dcfb95b593bbf0ffe5f79a43b59ecbe4dc18dae76870cb80343"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:18:49 crc kubenswrapper[4782]: I0130 19:18:49.795114 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://8ec9579f4fc94dcfb95b593bbf0ffe5f79a43b59ecbe4dc18dae76870cb80343" gracePeriod=600 Jan 30 19:18:50 crc kubenswrapper[4782]: I0130 19:18:50.194291 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:50 crc kubenswrapper[4782]: I0130 19:18:50.250647 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:50 crc kubenswrapper[4782]: I0130 19:18:50.353472 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="8ec9579f4fc94dcfb95b593bbf0ffe5f79a43b59ecbe4dc18dae76870cb80343" exitCode=0 Jan 30 19:18:50 crc kubenswrapper[4782]: I0130 19:18:50.353509 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"8ec9579f4fc94dcfb95b593bbf0ffe5f79a43b59ecbe4dc18dae76870cb80343"} Jan 30 19:18:50 crc kubenswrapper[4782]: I0130 19:18:50.353552 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70"} Jan 30 19:18:50 crc kubenswrapper[4782]: I0130 19:18:50.353581 4782 scope.go:117] "RemoveContainer" containerID="aa96ae91016937b922ee44a935014cf453a211563b97c7a6672806cb2f4bf052" Jan 30 19:18:50 crc kubenswrapper[4782]: I0130 19:18:50.972957 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjc4s"] Jan 30 19:18:51 crc kubenswrapper[4782]: I0130 19:18:51.369607 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fjc4s" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="registry-server" containerID="cri-o://88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471" gracePeriod=2 Jan 30 19:18:51 crc kubenswrapper[4782]: I0130 19:18:51.806643 4782 util.go:48] "No ready sandbox for pod can be found. 
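
The recurring liveness failures above are an HTTP GET against http://127.0.0.1:8798/health being refused, and the registry-server startup probes fail because nothing answers on port 50051 within their 1-second timeout. A standalone Go sketch of roughly equivalent checks, with the gRPC health probe approximated as a plain TCP dial (an illustration, not the kubelet prober itself):

```go
// probes_sketch.go — standalone equivalents of the two failing checks recorded
// above: the machine-config-daemon HTTP liveness probe on 127.0.0.1:8798/health
// and the registry-server startup probe against port :50051 (approximated here
// as a TCP dial). Illustrative sketch only, not kubelet code.
package main

import (
	"fmt"
	"net"
	"net/http"
	"time"
)

func httpCheck(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as in the log above
	}
	defer resp.Body.Close()
	// Kubelet's HTTP prober treats 2xx/3xx as healthy.
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status %d", resp.StatusCode)
	}
	return nil
}

func tcpCheck(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err // analogous to: failed to connect service ":50051" within 1s
	}
	return conn.Close()
}

func main() {
	fmt.Println("liveness:", httpCheck("http://127.0.0.1:8798/health", time.Second))
	fmt.Println("startup :", tcpCheck("127.0.0.1:50051", time.Second))
}
```
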
Need to start a new one" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:51 crc kubenswrapper[4782]: I0130 19:18:51.906928 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l99lq\" (UniqueName: \"kubernetes.io/projected/de288e0e-2d9e-483c-9012-211efe15573c-kube-api-access-l99lq\") pod \"de288e0e-2d9e-483c-9012-211efe15573c\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " Jan 30 19:18:51 crc kubenswrapper[4782]: I0130 19:18:51.907001 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-utilities\") pod \"de288e0e-2d9e-483c-9012-211efe15573c\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " Jan 30 19:18:51 crc kubenswrapper[4782]: I0130 19:18:51.907085 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-catalog-content\") pod \"de288e0e-2d9e-483c-9012-211efe15573c\" (UID: \"de288e0e-2d9e-483c-9012-211efe15573c\") " Jan 30 19:18:51 crc kubenswrapper[4782]: I0130 19:18:51.908443 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-utilities" (OuterVolumeSpecName: "utilities") pod "de288e0e-2d9e-483c-9012-211efe15573c" (UID: "de288e0e-2d9e-483c-9012-211efe15573c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:18:51 crc kubenswrapper[4782]: I0130 19:18:51.918631 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de288e0e-2d9e-483c-9012-211efe15573c-kube-api-access-l99lq" (OuterVolumeSpecName: "kube-api-access-l99lq") pod "de288e0e-2d9e-483c-9012-211efe15573c" (UID: "de288e0e-2d9e-483c-9012-211efe15573c"). InnerVolumeSpecName "kube-api-access-l99lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.010058 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l99lq\" (UniqueName: \"kubernetes.io/projected/de288e0e-2d9e-483c-9012-211efe15573c-kube-api-access-l99lq\") on node \"crc\" DevicePath \"\"" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.011109 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.033958 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de288e0e-2d9e-483c-9012-211efe15573c" (UID: "de288e0e-2d9e-483c-9012-211efe15573c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.113065 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de288e0e-2d9e-483c-9012-211efe15573c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.387207 4782 generic.go:334] "Generic (PLEG): container finished" podID="de288e0e-2d9e-483c-9012-211efe15573c" containerID="88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471" exitCode=0 Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.387319 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjc4s" event={"ID":"de288e0e-2d9e-483c-9012-211efe15573c","Type":"ContainerDied","Data":"88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471"} Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.387350 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fjc4s" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.387395 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fjc4s" event={"ID":"de288e0e-2d9e-483c-9012-211efe15573c","Type":"ContainerDied","Data":"89c226aec75a1ab508777b85b6528fa5591f105116a2a4a99be42100cd75a576"} Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.387436 4782 scope.go:117] "RemoveContainer" containerID="88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.417710 4782 scope.go:117] "RemoveContainer" containerID="34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.449736 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fjc4s"] Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.458539 4782 scope.go:117] "RemoveContainer" containerID="ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.462800 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fjc4s"] Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.519935 4782 scope.go:117] "RemoveContainer" containerID="88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471" Jan 30 19:18:52 crc kubenswrapper[4782]: E0130 19:18:52.520429 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471\": container with ID starting with 88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471 not found: ID does not exist" containerID="88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.520464 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471"} err="failed to get container status \"88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471\": rpc error: code = NotFound desc = could not find container \"88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471\": container with ID starting with 88885c90219fd984b952ee35b912f5e202f60443dc1b93d43b232a22d417e471 not found: ID does not exist" Jan 30 19:18:52 crc 
kubenswrapper[4782]: I0130 19:18:52.520489 4782 scope.go:117] "RemoveContainer" containerID="34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22" Jan 30 19:18:52 crc kubenswrapper[4782]: E0130 19:18:52.520798 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22\": container with ID starting with 34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22 not found: ID does not exist" containerID="34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.520821 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22"} err="failed to get container status \"34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22\": rpc error: code = NotFound desc = could not find container \"34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22\": container with ID starting with 34cf99ae93f67ee4bee37f0265e05d86cbf1ede5545023b1d922b17ea3581b22 not found: ID does not exist" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.520838 4782 scope.go:117] "RemoveContainer" containerID="ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0" Jan 30 19:18:52 crc kubenswrapper[4782]: E0130 19:18:52.521028 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0\": container with ID starting with ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0 not found: ID does not exist" containerID="ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0" Jan 30 19:18:52 crc kubenswrapper[4782]: I0130 19:18:52.521045 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0"} err="failed to get container status \"ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0\": rpc error: code = NotFound desc = could not find container \"ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0\": container with ID starting with ab6650583a666835d2926e837e5e16c9e4dafb6b3b63ed566997161bc4eecfd0 not found: ID does not exist" Jan 30 19:18:54 crc kubenswrapper[4782]: I0130 19:18:54.430565 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de288e0e-2d9e-483c-9012-211efe15573c" path="/var/lib/kubelet/pods/de288e0e-2d9e-483c-9012-211efe15573c/volumes" Jan 30 19:21:19 crc kubenswrapper[4782]: I0130 19:21:19.792436 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:21:19 crc kubenswrapper[4782]: I0130 19:21:19.793399 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:21:49 crc kubenswrapper[4782]: I0130 19:21:49.793146 4782 patch_prober.go:28] interesting 
pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:21:49 crc kubenswrapper[4782]: I0130 19:21:49.794536 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:22:19 crc kubenswrapper[4782]: I0130 19:22:19.793033 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:22:19 crc kubenswrapper[4782]: I0130 19:22:19.793675 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:22:19 crc kubenswrapper[4782]: I0130 19:22:19.793726 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:22:19 crc kubenswrapper[4782]: I0130 19:22:19.794440 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:22:19 crc kubenswrapper[4782]: I0130 19:22:19.794512 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" gracePeriod=600 Jan 30 19:22:19 crc kubenswrapper[4782]: E0130 19:22:19.932271 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:22:20 crc kubenswrapper[4782]: I0130 19:22:20.151921 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" exitCode=0 Jan 30 19:22:20 crc kubenswrapper[4782]: I0130 19:22:20.151967 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70"} Jan 30 19:22:20 crc kubenswrapper[4782]: 
I0130 19:22:20.152000 4782 scope.go:117] "RemoveContainer" containerID="8ec9579f4fc94dcfb95b593bbf0ffe5f79a43b59ecbe4dc18dae76870cb80343" Jan 30 19:22:20 crc kubenswrapper[4782]: I0130 19:22:20.153267 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:22:20 crc kubenswrapper[4782]: E0130 19:22:20.153999 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:22:33 crc kubenswrapper[4782]: I0130 19:22:33.410904 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:22:33 crc kubenswrapper[4782]: E0130 19:22:33.411732 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:22:47 crc kubenswrapper[4782]: I0130 19:22:47.411666 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:22:47 crc kubenswrapper[4782]: E0130 19:22:47.412423 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:22:59 crc kubenswrapper[4782]: I0130 19:22:59.411097 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:22:59 crc kubenswrapper[4782]: E0130 19:22:59.412085 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.043630 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q2r6f"] Jan 30 19:23:13 crc kubenswrapper[4782]: E0130 19:23:13.046114 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="extract-utilities" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046173 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="extract-utilities" Jan 30 19:23:13 crc kubenswrapper[4782]: E0130 19:23:13.046192 4782 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="registry-server" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046205 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="registry-server" Jan 30 19:23:13 crc kubenswrapper[4782]: E0130 19:23:13.046282 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="extract-utilities" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046299 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="extract-utilities" Jan 30 19:23:13 crc kubenswrapper[4782]: E0130 19:23:13.046330 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="extract-content" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046345 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="extract-content" Jan 30 19:23:13 crc kubenswrapper[4782]: E0130 19:23:13.046378 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="registry-server" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046391 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="registry-server" Jan 30 19:23:13 crc kubenswrapper[4782]: E0130 19:23:13.046428 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="extract-content" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046440 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="extract-content" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046766 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="de288e0e-2d9e-483c-9012-211efe15573c" containerName="registry-server" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.046817 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="1820c759-018d-4283-9f5b-e06fdda4a367" containerName="registry-server" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.049436 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.061250 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2r6f"] Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.217097 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8jtt"] Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.219663 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.231896 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8jtt"] Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.236449 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-utilities\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.236661 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-catalog-content\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.236701 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcg8\" (UniqueName: \"kubernetes.io/projected/c69fcf15-1591-4fa4-b259-245d7e1ab499-kube-api-access-fgcg8\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.337989 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-catalog-content\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.338048 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-catalog-content\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.338110 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcg8\" (UniqueName: \"kubernetes.io/projected/c69fcf15-1591-4fa4-b259-245d7e1ab499-kube-api-access-fgcg8\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.338175 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgczt\" (UniqueName: \"kubernetes.io/projected/0f960e49-7dc7-4adb-bb42-cba03b032522-kube-api-access-sgczt\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.338194 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-utilities\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc 
kubenswrapper[4782]: I0130 19:23:13.338290 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-utilities\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.338720 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-catalog-content\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.338731 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-utilities\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.359690 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcg8\" (UniqueName: \"kubernetes.io/projected/c69fcf15-1591-4fa4-b259-245d7e1ab499-kube-api-access-fgcg8\") pod \"redhat-marketplace-q2r6f\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.404159 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.439780 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgczt\" (UniqueName: \"kubernetes.io/projected/0f960e49-7dc7-4adb-bb42-cba03b032522-kube-api-access-sgczt\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.440159 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-utilities\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.440266 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-catalog-content\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.440879 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-catalog-content\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.441831 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-utilities\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.459288 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgczt\" (UniqueName: \"kubernetes.io/projected/0f960e49-7dc7-4adb-bb42-cba03b032522-kube-api-access-sgczt\") pod \"community-operators-r8jtt\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:13 crc kubenswrapper[4782]: I0130 19:23:13.551466 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.049266 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2r6f"] Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.435001 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:23:14 crc kubenswrapper[4782]: E0130 19:23:14.437282 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.759207 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8jtt"] Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.801336 4782 generic.go:334] "Generic (PLEG): container finished" podID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerID="75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184" exitCode=0 Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.801689 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2r6f" event={"ID":"c69fcf15-1591-4fa4-b259-245d7e1ab499","Type":"ContainerDied","Data":"75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184"} Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.801717 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2r6f" event={"ID":"c69fcf15-1591-4fa4-b259-245d7e1ab499","Type":"ContainerStarted","Data":"a874d90e7135cde6e5f11b09441997730d9871f65fd3821e0e0d07ba47496f02"} Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.803969 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:23:14 crc kubenswrapper[4782]: I0130 19:23:14.806461 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8jtt" event={"ID":"0f960e49-7dc7-4adb-bb42-cba03b032522","Type":"ContainerStarted","Data":"f23bdf516b2c1f70351987fe05eff1fb0c77b3a30287a9b2f7c9eaed584dbcb1"} Jan 30 19:23:15 crc kubenswrapper[4782]: I0130 19:23:15.817323 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2r6f" event={"ID":"c69fcf15-1591-4fa4-b259-245d7e1ab499","Type":"ContainerStarted","Data":"28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee"} Jan 30 19:23:15 crc 
kubenswrapper[4782]: I0130 19:23:15.819211 4782 generic.go:334] "Generic (PLEG): container finished" podID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerID="3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42" exitCode=0 Jan 30 19:23:15 crc kubenswrapper[4782]: I0130 19:23:15.819262 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8jtt" event={"ID":"0f960e49-7dc7-4adb-bb42-cba03b032522","Type":"ContainerDied","Data":"3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42"} Jan 30 19:23:16 crc kubenswrapper[4782]: I0130 19:23:16.829146 4782 generic.go:334] "Generic (PLEG): container finished" podID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerID="28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee" exitCode=0 Jan 30 19:23:16 crc kubenswrapper[4782]: I0130 19:23:16.829203 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2r6f" event={"ID":"c69fcf15-1591-4fa4-b259-245d7e1ab499","Type":"ContainerDied","Data":"28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee"} Jan 30 19:23:17 crc kubenswrapper[4782]: I0130 19:23:17.844748 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2r6f" event={"ID":"c69fcf15-1591-4fa4-b259-245d7e1ab499","Type":"ContainerStarted","Data":"3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9"} Jan 30 19:23:17 crc kubenswrapper[4782]: I0130 19:23:17.847551 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8jtt" event={"ID":"0f960e49-7dc7-4adb-bb42-cba03b032522","Type":"ContainerStarted","Data":"4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86"} Jan 30 19:23:17 crc kubenswrapper[4782]: I0130 19:23:17.867956 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q2r6f" podStartSLOduration=2.384992656 podStartE2EDuration="4.867938145s" podCreationTimestamp="2026-01-30 19:23:13 +0000 UTC" firstStartedPulling="2026-01-30 19:23:14.803765752 +0000 UTC m=+3171.072143767" lastFinishedPulling="2026-01-30 19:23:17.286711201 +0000 UTC m=+3173.555089256" observedRunningTime="2026-01-30 19:23:17.862718466 +0000 UTC m=+3174.131096501" watchObservedRunningTime="2026-01-30 19:23:17.867938145 +0000 UTC m=+3174.136316170" Jan 30 19:23:18 crc kubenswrapper[4782]: I0130 19:23:18.860624 4782 generic.go:334] "Generic (PLEG): container finished" podID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerID="4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86" exitCode=0 Jan 30 19:23:18 crc kubenswrapper[4782]: I0130 19:23:18.860749 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8jtt" event={"ID":"0f960e49-7dc7-4adb-bb42-cba03b032522","Type":"ContainerDied","Data":"4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86"} Jan 30 19:23:19 crc kubenswrapper[4782]: I0130 19:23:19.871966 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8jtt" event={"ID":"0f960e49-7dc7-4adb-bb42-cba03b032522","Type":"ContainerStarted","Data":"5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba"} Jan 30 19:23:19 crc kubenswrapper[4782]: I0130 19:23:19.892867 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8jtt" podStartSLOduration=3.445288564 
podStartE2EDuration="6.892848977s" podCreationTimestamp="2026-01-30 19:23:13 +0000 UTC" firstStartedPulling="2026-01-30 19:23:15.821104508 +0000 UTC m=+3172.089482533" lastFinishedPulling="2026-01-30 19:23:19.268664911 +0000 UTC m=+3175.537042946" observedRunningTime="2026-01-30 19:23:19.890572491 +0000 UTC m=+3176.158950516" watchObservedRunningTime="2026-01-30 19:23:19.892848977 +0000 UTC m=+3176.161227002" Jan 30 19:23:23 crc kubenswrapper[4782]: I0130 19:23:23.405278 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:23 crc kubenswrapper[4782]: I0130 19:23:23.406051 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:23 crc kubenswrapper[4782]: I0130 19:23:23.552216 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:23 crc kubenswrapper[4782]: I0130 19:23:23.552333 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:24 crc kubenswrapper[4782]: I0130 19:23:24.470646 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-q2r6f" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="registry-server" probeResult="failure" output=< Jan 30 19:23:24 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:23:24 crc kubenswrapper[4782]: > Jan 30 19:23:24 crc kubenswrapper[4782]: I0130 19:23:24.623987 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-r8jtt" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="registry-server" probeResult="failure" output=< Jan 30 19:23:24 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:23:24 crc kubenswrapper[4782]: > Jan 30 19:23:27 crc kubenswrapper[4782]: I0130 19:23:27.410602 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:23:27 crc kubenswrapper[4782]: E0130 19:23:27.411263 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:23:33 crc kubenswrapper[4782]: I0130 19:23:33.483364 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:33 crc kubenswrapper[4782]: I0130 19:23:33.584970 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:33 crc kubenswrapper[4782]: I0130 19:23:33.642343 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:33 crc kubenswrapper[4782]: I0130 19:23:33.723055 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:33 crc kubenswrapper[4782]: I0130 19:23:33.750363 4782 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-q2r6f"] Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.029132 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q2r6f" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="registry-server" containerID="cri-o://3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9" gracePeriod=2 Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.542548 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.573814 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgcg8\" (UniqueName: \"kubernetes.io/projected/c69fcf15-1591-4fa4-b259-245d7e1ab499-kube-api-access-fgcg8\") pod \"c69fcf15-1591-4fa4-b259-245d7e1ab499\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.573956 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-catalog-content\") pod \"c69fcf15-1591-4fa4-b259-245d7e1ab499\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.574012 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-utilities\") pod \"c69fcf15-1591-4fa4-b259-245d7e1ab499\" (UID: \"c69fcf15-1591-4fa4-b259-245d7e1ab499\") " Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.574849 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-utilities" (OuterVolumeSpecName: "utilities") pod "c69fcf15-1591-4fa4-b259-245d7e1ab499" (UID: "c69fcf15-1591-4fa4-b259-245d7e1ab499"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.575172 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.591474 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69fcf15-1591-4fa4-b259-245d7e1ab499-kube-api-access-fgcg8" (OuterVolumeSpecName: "kube-api-access-fgcg8") pod "c69fcf15-1591-4fa4-b259-245d7e1ab499" (UID: "c69fcf15-1591-4fa4-b259-245d7e1ab499"). InnerVolumeSpecName "kube-api-access-fgcg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.626635 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c69fcf15-1591-4fa4-b259-245d7e1ab499" (UID: "c69fcf15-1591-4fa4-b259-245d7e1ab499"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.677706 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgcg8\" (UniqueName: \"kubernetes.io/projected/c69fcf15-1591-4fa4-b259-245d7e1ab499-kube-api-access-fgcg8\") on node \"crc\" DevicePath \"\"" Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.677745 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69fcf15-1591-4fa4-b259-245d7e1ab499-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.947065 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8jtt"] Jan 30 19:23:35 crc kubenswrapper[4782]: I0130 19:23:35.947380 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8jtt" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="registry-server" containerID="cri-o://5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba" gracePeriod=2 Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.040155 4782 generic.go:334] "Generic (PLEG): container finished" podID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerID="3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9" exitCode=0 Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.040200 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2r6f" event={"ID":"c69fcf15-1591-4fa4-b259-245d7e1ab499","Type":"ContainerDied","Data":"3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9"} Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.040222 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q2r6f" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.040240 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q2r6f" event={"ID":"c69fcf15-1591-4fa4-b259-245d7e1ab499","Type":"ContainerDied","Data":"a874d90e7135cde6e5f11b09441997730d9871f65fd3821e0e0d07ba47496f02"} Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.040252 4782 scope.go:117] "RemoveContainer" containerID="3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.129840 4782 scope.go:117] "RemoveContainer" containerID="28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.132790 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2r6f"] Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.146163 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q2r6f"] Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.156034 4782 scope.go:117] "RemoveContainer" containerID="75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.218357 4782 scope.go:117] "RemoveContainer" containerID="3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9" Jan 30 19:23:36 crc kubenswrapper[4782]: E0130 19:23:36.218970 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9\": container with ID starting with 3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9 not found: ID does not exist" containerID="3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.219034 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9"} err="failed to get container status \"3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9\": rpc error: code = NotFound desc = could not find container \"3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9\": container with ID starting with 3a97ab936043ece3590a1137ae2aa68b0915943d66740a21d479062a7d7d60d9 not found: ID does not exist" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.219068 4782 scope.go:117] "RemoveContainer" containerID="28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee" Jan 30 19:23:36 crc kubenswrapper[4782]: E0130 19:23:36.219462 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee\": container with ID starting with 28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee not found: ID does not exist" containerID="28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.219488 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee"} err="failed to get container status \"28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee\": rpc error: code = NotFound desc = could not find 
container \"28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee\": container with ID starting with 28b64de8c9204e942017ec201cae7678e18950919b1702e9b51d263e9a7703ee not found: ID does not exist" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.219505 4782 scope.go:117] "RemoveContainer" containerID="75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184" Jan 30 19:23:36 crc kubenswrapper[4782]: E0130 19:23:36.219865 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184\": container with ID starting with 75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184 not found: ID does not exist" containerID="75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.219896 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184"} err="failed to get container status \"75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184\": rpc error: code = NotFound desc = could not find container \"75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184\": container with ID starting with 75a3657b5c3b1c2ea7e67c6c37f6145325b3df0d24d6e70c1a92c8c03f5b8184 not found: ID does not exist" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.421807 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" path="/var/lib/kubelet/pods/c69fcf15-1591-4fa4-b259-245d7e1ab499/volumes" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.471697 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.495069 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-catalog-content\") pod \"0f960e49-7dc7-4adb-bb42-cba03b032522\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.495166 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgczt\" (UniqueName: \"kubernetes.io/projected/0f960e49-7dc7-4adb-bb42-cba03b032522-kube-api-access-sgczt\") pod \"0f960e49-7dc7-4adb-bb42-cba03b032522\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.495298 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-utilities\") pod \"0f960e49-7dc7-4adb-bb42-cba03b032522\" (UID: \"0f960e49-7dc7-4adb-bb42-cba03b032522\") " Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.496971 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-utilities" (OuterVolumeSpecName: "utilities") pod "0f960e49-7dc7-4adb-bb42-cba03b032522" (UID: "0f960e49-7dc7-4adb-bb42-cba03b032522"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.504543 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f960e49-7dc7-4adb-bb42-cba03b032522-kube-api-access-sgczt" (OuterVolumeSpecName: "kube-api-access-sgczt") pod "0f960e49-7dc7-4adb-bb42-cba03b032522" (UID: "0f960e49-7dc7-4adb-bb42-cba03b032522"). InnerVolumeSpecName "kube-api-access-sgczt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.575974 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f960e49-7dc7-4adb-bb42-cba03b032522" (UID: "0f960e49-7dc7-4adb-bb42-cba03b032522"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.597930 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.597955 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgczt\" (UniqueName: \"kubernetes.io/projected/0f960e49-7dc7-4adb-bb42-cba03b032522-kube-api-access-sgczt\") on node \"crc\" DevicePath \"\"" Jan 30 19:23:36 crc kubenswrapper[4782]: I0130 19:23:36.597968 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f960e49-7dc7-4adb-bb42-cba03b032522-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.053894 4782 generic.go:334] "Generic (PLEG): container finished" podID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerID="5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba" exitCode=0 Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.054276 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8jtt" event={"ID":"0f960e49-7dc7-4adb-bb42-cba03b032522","Type":"ContainerDied","Data":"5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba"} Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.055573 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8jtt" event={"ID":"0f960e49-7dc7-4adb-bb42-cba03b032522","Type":"ContainerDied","Data":"f23bdf516b2c1f70351987fe05eff1fb0c77b3a30287a9b2f7c9eaed584dbcb1"} Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.055658 4782 scope.go:117] "RemoveContainer" containerID="5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.054397 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8jtt" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.088684 4782 scope.go:117] "RemoveContainer" containerID="4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.109390 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8jtt"] Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.121022 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8jtt"] Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.125739 4782 scope.go:117] "RemoveContainer" containerID="3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.150614 4782 scope.go:117] "RemoveContainer" containerID="5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba" Jan 30 19:23:37 crc kubenswrapper[4782]: E0130 19:23:37.152668 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba\": container with ID starting with 5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba not found: ID does not exist" containerID="5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.152754 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba"} err="failed to get container status \"5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba\": rpc error: code = NotFound desc = could not find container \"5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba\": container with ID starting with 5be5eb213cce13449c51312aeccc382696d40cb220962f0d922e7d79cb5bb7ba not found: ID does not exist" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.152825 4782 scope.go:117] "RemoveContainer" containerID="4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86" Jan 30 19:23:37 crc kubenswrapper[4782]: E0130 19:23:37.153376 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86\": container with ID starting with 4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86 not found: ID does not exist" containerID="4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.153427 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86"} err="failed to get container status \"4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86\": rpc error: code = NotFound desc = could not find container \"4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86\": container with ID starting with 4fe9e57d25c7069b7d794197f671dcc631c7fdcef98b00481a53a69249cb2a86 not found: ID does not exist" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.153447 4782 scope.go:117] "RemoveContainer" containerID="3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42" Jan 30 19:23:37 crc kubenswrapper[4782]: E0130 19:23:37.153890 4782 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42\": container with ID starting with 3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42 not found: ID does not exist" containerID="3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42" Jan 30 19:23:37 crc kubenswrapper[4782]: I0130 19:23:37.153976 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42"} err="failed to get container status \"3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42\": rpc error: code = NotFound desc = could not find container \"3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42\": container with ID starting with 3712d1aa98608ba792df0eb5c051a0b4ccae68b1941a6fc7772936603cc4ac42 not found: ID does not exist" Jan 30 19:23:38 crc kubenswrapper[4782]: I0130 19:23:38.423128 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" path="/var/lib/kubelet/pods/0f960e49-7dc7-4adb-bb42-cba03b032522/volumes" Jan 30 19:23:40 crc kubenswrapper[4782]: I0130 19:23:40.411295 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:23:40 crc kubenswrapper[4782]: E0130 19:23:40.412168 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:23:55 crc kubenswrapper[4782]: I0130 19:23:55.410905 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:23:55 crc kubenswrapper[4782]: E0130 19:23:55.412113 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:24:07 crc kubenswrapper[4782]: I0130 19:24:07.411572 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:24:07 crc kubenswrapper[4782]: E0130 19:24:07.412653 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:24:19 crc kubenswrapper[4782]: I0130 19:24:19.411814 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:24:19 crc kubenswrapper[4782]: E0130 19:24:19.412837 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:24:33 crc kubenswrapper[4782]: I0130 19:24:33.410925 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:24:33 crc kubenswrapper[4782]: E0130 19:24:33.411841 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:24:45 crc kubenswrapper[4782]: I0130 19:24:45.410969 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:24:45 crc kubenswrapper[4782]: E0130 19:24:45.411885 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:24:58 crc kubenswrapper[4782]: I0130 19:24:58.412015 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:24:58 crc kubenswrapper[4782]: E0130 19:24:58.414097 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:25:10 crc kubenswrapper[4782]: I0130 19:25:10.414259 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:25:10 crc kubenswrapper[4782]: E0130 19:25:10.415199 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:25:25 crc kubenswrapper[4782]: I0130 19:25:25.411166 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:25:25 crc kubenswrapper[4782]: E0130 19:25:25.412442 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:25:36 crc kubenswrapper[4782]: I0130 19:25:36.415926 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:25:36 crc kubenswrapper[4782]: E0130 19:25:36.417661 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:25:47 crc kubenswrapper[4782]: I0130 19:25:47.411755 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:25:47 crc kubenswrapper[4782]: E0130 19:25:47.413034 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:26:02 crc kubenswrapper[4782]: I0130 19:26:02.410747 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:26:02 crc kubenswrapper[4782]: E0130 19:26:02.411551 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:26:13 crc kubenswrapper[4782]: I0130 19:26:13.411996 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:26:13 crc kubenswrapper[4782]: E0130 19:26:13.412952 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:26:25 crc kubenswrapper[4782]: I0130 19:26:25.410974 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:26:25 crc kubenswrapper[4782]: E0130 19:26:25.412032 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" 
podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:26:37 crc kubenswrapper[4782]: I0130 19:26:37.411679 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:26:37 crc kubenswrapper[4782]: E0130 19:26:37.412540 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:26:48 crc kubenswrapper[4782]: I0130 19:26:48.411663 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:26:48 crc kubenswrapper[4782]: E0130 19:26:48.412646 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:27:02 crc kubenswrapper[4782]: I0130 19:27:02.410949 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:27:02 crc kubenswrapper[4782]: E0130 19:27:02.411731 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:27:17 crc kubenswrapper[4782]: I0130 19:27:17.410928 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:27:17 crc kubenswrapper[4782]: E0130 19:27:17.411598 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:27:32 crc kubenswrapper[4782]: I0130 19:27:32.412050 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:27:33 crc kubenswrapper[4782]: I0130 19:27:33.687324 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"234e22ebdde9ac053f0b054b33ad4dc760afea0ecfdc2e6df912b9df0007602e"} Jan 30 19:27:52 crc kubenswrapper[4782]: E0130 19:27:52.578456 4782 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.212:57836->38.102.83.212:36463: write tcp 38.102.83.212:57836->38.102.83.212:36463: write: broken pipe Jan 30 19:29:14 crc kubenswrapper[4782]: 
I0130 19:29:14.660824 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvqvf"] Jan 30 19:29:14 crc kubenswrapper[4782]: E0130 19:29:14.663400 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="extract-content" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.663527 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="extract-content" Jan 30 19:29:14 crc kubenswrapper[4782]: E0130 19:29:14.663619 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="extract-content" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.663698 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="extract-content" Jan 30 19:29:14 crc kubenswrapper[4782]: E0130 19:29:14.663785 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="registry-server" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.663858 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="registry-server" Jan 30 19:29:14 crc kubenswrapper[4782]: E0130 19:29:14.663943 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="registry-server" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.664015 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="registry-server" Jan 30 19:29:14 crc kubenswrapper[4782]: E0130 19:29:14.664094 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="extract-utilities" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.664175 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="extract-utilities" Jan 30 19:29:14 crc kubenswrapper[4782]: E0130 19:29:14.664293 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="extract-utilities" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.664378 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="extract-utilities" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.664727 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f960e49-7dc7-4adb-bb42-cba03b032522" containerName="registry-server" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.664816 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69fcf15-1591-4fa4-b259-245d7e1ab499" containerName="registry-server" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.666892 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.683741 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvqvf"] Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.770107 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-catalog-content\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.770199 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567sr\" (UniqueName: \"kubernetes.io/projected/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-kube-api-access-567sr\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.770359 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-utilities\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.873014 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-567sr\" (UniqueName: \"kubernetes.io/projected/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-kube-api-access-567sr\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.873325 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-utilities\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.873486 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-catalog-content\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.873835 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-utilities\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.873956 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-catalog-content\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.904976 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-567sr\" (UniqueName: \"kubernetes.io/projected/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-kube-api-access-567sr\") pod \"redhat-operators-jvqvf\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:14 crc kubenswrapper[4782]: I0130 19:29:14.999143 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:15 crc kubenswrapper[4782]: I0130 19:29:15.502422 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvqvf"] Jan 30 19:29:15 crc kubenswrapper[4782]: I0130 19:29:15.842858 4782 generic.go:334] "Generic (PLEG): container finished" podID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerID="a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6" exitCode=0 Jan 30 19:29:15 crc kubenswrapper[4782]: I0130 19:29:15.842895 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvqvf" event={"ID":"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b","Type":"ContainerDied","Data":"a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6"} Jan 30 19:29:15 crc kubenswrapper[4782]: I0130 19:29:15.842919 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvqvf" event={"ID":"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b","Type":"ContainerStarted","Data":"a09485a8b6ad4a7dccfda8c99ce6e446221ab32c12538f2f0a1b109a762b99ca"} Jan 30 19:29:15 crc kubenswrapper[4782]: I0130 19:29:15.844642 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:29:17 crc kubenswrapper[4782]: I0130 19:29:17.876415 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvqvf" event={"ID":"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b","Type":"ContainerStarted","Data":"52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944"} Jan 30 19:29:22 crc kubenswrapper[4782]: I0130 19:29:22.938873 4782 generic.go:334] "Generic (PLEG): container finished" podID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerID="52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944" exitCode=0 Jan 30 19:29:22 crc kubenswrapper[4782]: I0130 19:29:22.938971 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvqvf" event={"ID":"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b","Type":"ContainerDied","Data":"52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944"} Jan 30 19:29:23 crc kubenswrapper[4782]: I0130 19:29:23.954034 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvqvf" event={"ID":"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b","Type":"ContainerStarted","Data":"1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3"} Jan 30 19:29:23 crc kubenswrapper[4782]: I0130 19:29:23.991280 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvqvf" podStartSLOduration=2.488830325 podStartE2EDuration="9.991215386s" podCreationTimestamp="2026-01-30 19:29:14 +0000 UTC" firstStartedPulling="2026-01-30 19:29:15.844392409 +0000 UTC m=+3532.112770434" lastFinishedPulling="2026-01-30 19:29:23.34677746 +0000 UTC m=+3539.615155495" observedRunningTime="2026-01-30 19:29:23.97525463 +0000 UTC m=+3540.243632685" watchObservedRunningTime="2026-01-30 19:29:23.991215386 +0000 UTC m=+3540.259593451" Jan 30 19:29:25 crc 
kubenswrapper[4782]: I0130 19:29:24.999988 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:25 crc kubenswrapper[4782]: I0130 19:29:25.000548 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:26 crc kubenswrapper[4782]: I0130 19:29:26.067456 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvqvf" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="registry-server" probeResult="failure" output=< Jan 30 19:29:26 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:29:26 crc kubenswrapper[4782]: > Jan 30 19:29:36 crc kubenswrapper[4782]: I0130 19:29:36.064828 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvqvf" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="registry-server" probeResult="failure" output=< Jan 30 19:29:36 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:29:36 crc kubenswrapper[4782]: > Jan 30 19:29:45 crc kubenswrapper[4782]: I0130 19:29:45.077634 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:45 crc kubenswrapper[4782]: I0130 19:29:45.148018 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:45 crc kubenswrapper[4782]: I0130 19:29:45.873496 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvqvf"] Jan 30 19:29:46 crc kubenswrapper[4782]: I0130 19:29:46.218534 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jvqvf" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="registry-server" containerID="cri-o://1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3" gracePeriod=2 Jan 30 19:29:46 crc kubenswrapper[4782]: I0130 19:29:46.761604 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:46 crc kubenswrapper[4782]: I0130 19:29:46.908098 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-catalog-content\") pod \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " Jan 30 19:29:46 crc kubenswrapper[4782]: I0130 19:29:46.908298 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-utilities\") pod \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " Jan 30 19:29:46 crc kubenswrapper[4782]: I0130 19:29:46.908406 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-567sr\" (UniqueName: \"kubernetes.io/projected/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-kube-api-access-567sr\") pod \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\" (UID: \"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b\") " Jan 30 19:29:46 crc kubenswrapper[4782]: I0130 19:29:46.909189 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-utilities" (OuterVolumeSpecName: "utilities") pod "d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" (UID: "d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:29:46 crc kubenswrapper[4782]: I0130 19:29:46.916605 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-kube-api-access-567sr" (OuterVolumeSpecName: "kube-api-access-567sr") pod "d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" (UID: "d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b"). InnerVolumeSpecName "kube-api-access-567sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.011512 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.011536 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-567sr\" (UniqueName: \"kubernetes.io/projected/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-kube-api-access-567sr\") on node \"crc\" DevicePath \"\"" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.024978 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" (UID: "d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.113998 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.235521 4782 generic.go:334] "Generic (PLEG): container finished" podID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerID="1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3" exitCode=0 Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.235562 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvqvf" event={"ID":"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b","Type":"ContainerDied","Data":"1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3"} Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.235589 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvqvf" event={"ID":"d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b","Type":"ContainerDied","Data":"a09485a8b6ad4a7dccfda8c99ce6e446221ab32c12538f2f0a1b109a762b99ca"} Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.235607 4782 scope.go:117] "RemoveContainer" containerID="1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.235663 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvqvf" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.275603 4782 scope.go:117] "RemoveContainer" containerID="52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.282200 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvqvf"] Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.293043 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvqvf"] Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.297546 4782 scope.go:117] "RemoveContainer" containerID="a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.341065 4782 scope.go:117] "RemoveContainer" containerID="1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3" Jan 30 19:29:47 crc kubenswrapper[4782]: E0130 19:29:47.341645 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3\": container with ID starting with 1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3 not found: ID does not exist" containerID="1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.341711 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3"} err="failed to get container status \"1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3\": rpc error: code = NotFound desc = could not find container \"1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3\": container with ID starting with 1789e6e9174c08c8601b0d44e275dac912e4609db2c80fdd7f4e63a07a90e8e3 not found: ID does not exist" Jan 30 19:29:47 crc 
kubenswrapper[4782]: I0130 19:29:47.341754 4782 scope.go:117] "RemoveContainer" containerID="52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944" Jan 30 19:29:47 crc kubenswrapper[4782]: E0130 19:29:47.342573 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944\": container with ID starting with 52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944 not found: ID does not exist" containerID="52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.342626 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944"} err="failed to get container status \"52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944\": rpc error: code = NotFound desc = could not find container \"52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944\": container with ID starting with 52cd0e524c178ed8fb80a8e6407059ac4b7df9281ab2a3603f43c75994b1f944 not found: ID does not exist" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.342665 4782 scope.go:117] "RemoveContainer" containerID="a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6" Jan 30 19:29:47 crc kubenswrapper[4782]: E0130 19:29:47.342960 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6\": container with ID starting with a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6 not found: ID does not exist" containerID="a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6" Jan 30 19:29:47 crc kubenswrapper[4782]: I0130 19:29:47.343001 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6"} err="failed to get container status \"a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6\": rpc error: code = NotFound desc = could not find container \"a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6\": container with ID starting with a1010bc2e0614c409d3fd8e76ec2fe51dca9e4c0dbae67cae97d1b9e940864a6 not found: ID does not exist" Jan 30 19:29:48 crc kubenswrapper[4782]: I0130 19:29:48.422786 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" path="/var/lib/kubelet/pods/d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b/volumes" Jan 30 19:29:49 crc kubenswrapper[4782]: I0130 19:29:49.793071 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:29:49 crc kubenswrapper[4782]: I0130 19:29:49.793893 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.223200 4782 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5"] Jan 30 19:30:00 crc kubenswrapper[4782]: E0130 19:30:00.225382 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="extract-content" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.225482 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="extract-content" Jan 30 19:30:00 crc kubenswrapper[4782]: E0130 19:30:00.225565 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="extract-utilities" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.225633 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="extract-utilities" Jan 30 19:30:00 crc kubenswrapper[4782]: E0130 19:30:00.225732 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="registry-server" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.225789 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="registry-server" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.226116 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="d173e2bf-8c95-4fe4-b3ad-ba53457c7e9b" containerName="registry-server" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.232149 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.237472 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.237892 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.245064 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5"] Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.334356 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5w79\" (UniqueName: \"kubernetes.io/projected/0b24fd70-c832-4124-b936-e73e54e41b38-kube-api-access-p5w79\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.334742 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b24fd70-c832-4124-b936-e73e54e41b38-secret-volume\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.334857 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b24fd70-c832-4124-b936-e73e54e41b38-config-volume\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.436556 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b24fd70-c832-4124-b936-e73e54e41b38-secret-volume\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.436617 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b24fd70-c832-4124-b936-e73e54e41b38-config-volume\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.436735 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5w79\" (UniqueName: \"kubernetes.io/projected/0b24fd70-c832-4124-b936-e73e54e41b38-kube-api-access-p5w79\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.438211 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b24fd70-c832-4124-b936-e73e54e41b38-config-volume\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.454840 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b24fd70-c832-4124-b936-e73e54e41b38-secret-volume\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.474940 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5w79\" (UniqueName: \"kubernetes.io/projected/0b24fd70-c832-4124-b936-e73e54e41b38-kube-api-access-p5w79\") pod \"collect-profiles-29496690-7cpl5\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:00 crc kubenswrapper[4782]: I0130 19:30:00.554270 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:01 crc kubenswrapper[4782]: I0130 19:30:01.008006 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5"] Jan 30 19:30:01 crc kubenswrapper[4782]: I0130 19:30:01.389978 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" event={"ID":"0b24fd70-c832-4124-b936-e73e54e41b38","Type":"ContainerStarted","Data":"ceb68721229ea238cf40a1500fff86d172f7bf2a6c5ee94080fbca1caf3d1f11"} Jan 30 19:30:01 crc kubenswrapper[4782]: I0130 19:30:01.390026 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" event={"ID":"0b24fd70-c832-4124-b936-e73e54e41b38","Type":"ContainerStarted","Data":"00b23af33acb4c7abd679d580870b09ec1cfb0c0fc1ddefe2ce4c66434176a29"} Jan 30 19:30:01 crc kubenswrapper[4782]: I0130 19:30:01.413460 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" podStartSLOduration=1.413437401 podStartE2EDuration="1.413437401s" podCreationTimestamp="2026-01-30 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 19:30:01.412078857 +0000 UTC m=+3577.680456912" watchObservedRunningTime="2026-01-30 19:30:01.413437401 +0000 UTC m=+3577.681815416" Jan 30 19:30:02 crc kubenswrapper[4782]: I0130 19:30:02.402037 4782 generic.go:334] "Generic (PLEG): container finished" podID="0b24fd70-c832-4124-b936-e73e54e41b38" containerID="ceb68721229ea238cf40a1500fff86d172f7bf2a6c5ee94080fbca1caf3d1f11" exitCode=0 Jan 30 19:30:02 crc kubenswrapper[4782]: I0130 19:30:02.402353 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" event={"ID":"0b24fd70-c832-4124-b936-e73e54e41b38","Type":"ContainerDied","Data":"ceb68721229ea238cf40a1500fff86d172f7bf2a6c5ee94080fbca1caf3d1f11"} Jan 30 19:30:03 crc kubenswrapper[4782]: I0130 19:30:03.837661 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:03 crc kubenswrapper[4782]: I0130 19:30:03.912237 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b24fd70-c832-4124-b936-e73e54e41b38-secret-volume\") pod \"0b24fd70-c832-4124-b936-e73e54e41b38\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " Jan 30 19:30:03 crc kubenswrapper[4782]: I0130 19:30:03.912751 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5w79\" (UniqueName: \"kubernetes.io/projected/0b24fd70-c832-4124-b936-e73e54e41b38-kube-api-access-p5w79\") pod \"0b24fd70-c832-4124-b936-e73e54e41b38\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " Jan 30 19:30:03 crc kubenswrapper[4782]: I0130 19:30:03.912918 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b24fd70-c832-4124-b936-e73e54e41b38-config-volume\") pod \"0b24fd70-c832-4124-b936-e73e54e41b38\" (UID: \"0b24fd70-c832-4124-b936-e73e54e41b38\") " Jan 30 19:30:03 crc kubenswrapper[4782]: I0130 19:30:03.914654 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b24fd70-c832-4124-b936-e73e54e41b38-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b24fd70-c832-4124-b936-e73e54e41b38" (UID: "0b24fd70-c832-4124-b936-e73e54e41b38"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:30:03 crc kubenswrapper[4782]: I0130 19:30:03.921121 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b24fd70-c832-4124-b936-e73e54e41b38-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b24fd70-c832-4124-b936-e73e54e41b38" (UID: "0b24fd70-c832-4124-b936-e73e54e41b38"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:30:03 crc kubenswrapper[4782]: I0130 19:30:03.934931 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b24fd70-c832-4124-b936-e73e54e41b38-kube-api-access-p5w79" (OuterVolumeSpecName: "kube-api-access-p5w79") pod "0b24fd70-c832-4124-b936-e73e54e41b38" (UID: "0b24fd70-c832-4124-b936-e73e54e41b38"). InnerVolumeSpecName "kube-api-access-p5w79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.014965 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5w79\" (UniqueName: \"kubernetes.io/projected/0b24fd70-c832-4124-b936-e73e54e41b38-kube-api-access-p5w79\") on node \"crc\" DevicePath \"\"" Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.015002 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b24fd70-c832-4124-b936-e73e54e41b38-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.015015 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b24fd70-c832-4124-b936-e73e54e41b38-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.428168 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" event={"ID":"0b24fd70-c832-4124-b936-e73e54e41b38","Type":"ContainerDied","Data":"00b23af33acb4c7abd679d580870b09ec1cfb0c0fc1ddefe2ce4c66434176a29"} Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.428221 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00b23af33acb4c7abd679d580870b09ec1cfb0c0fc1ddefe2ce4c66434176a29" Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.428288 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5" Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.517969 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl"] Jan 30 19:30:04 crc kubenswrapper[4782]: I0130 19:30:04.532185 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496645-474hl"] Jan 30 19:30:06 crc kubenswrapper[4782]: I0130 19:30:06.437771 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b16342-989f-4f2b-8eef-1e638aeb7858" path="/var/lib/kubelet/pods/c4b16342-989f-4f2b-8eef-1e638aeb7858/volumes" Jan 30 19:30:19 crc kubenswrapper[4782]: I0130 19:30:19.792488 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:30:19 crc kubenswrapper[4782]: I0130 19:30:19.793362 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:30:35 crc kubenswrapper[4782]: I0130 19:30:35.727270 4782 scope.go:117] "RemoveContainer" containerID="3de6827731405bb2f4c5bcef02cff0f66ee60f560a6480256e0cf7d894c02836" Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.793445 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.794329 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.794429 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.795543 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"234e22ebdde9ac053f0b054b33ad4dc760afea0ecfdc2e6df912b9df0007602e"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.795632 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://234e22ebdde9ac053f0b054b33ad4dc760afea0ecfdc2e6df912b9df0007602e" gracePeriod=600 Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.953312 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="234e22ebdde9ac053f0b054b33ad4dc760afea0ecfdc2e6df912b9df0007602e" exitCode=0 Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.953401 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"234e22ebdde9ac053f0b054b33ad4dc760afea0ecfdc2e6df912b9df0007602e"} Jan 30 19:30:49 crc kubenswrapper[4782]: I0130 19:30:49.953884 4782 scope.go:117] "RemoveContainer" containerID="80d2521c43c94fb4ef498558f27e7d9a9c8202bc2968e173cb1c40bc26479e70" Jan 30 19:30:50 crc kubenswrapper[4782]: I0130 19:30:50.978715 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae"} Jan 30 19:33:19 crc kubenswrapper[4782]: I0130 19:33:19.818395 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:33:19 crc kubenswrapper[4782]: I0130 19:33:19.818905 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.088085 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qnx8m"] Jan 30 19:33:42 crc kubenswrapper[4782]: E0130 19:33:42.089795 4782 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0b24fd70-c832-4124-b936-e73e54e41b38" containerName="collect-profiles" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.089863 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b24fd70-c832-4124-b936-e73e54e41b38" containerName="collect-profiles" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.090115 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b24fd70-c832-4124-b936-e73e54e41b38" containerName="collect-profiles" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.091540 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.104666 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qnx8m"] Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.255705 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-utilities\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.255760 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-catalog-content\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.255799 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czkmx\" (UniqueName: \"kubernetes.io/projected/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-kube-api-access-czkmx\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.358168 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czkmx\" (UniqueName: \"kubernetes.io/projected/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-kube-api-access-czkmx\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.358368 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-utilities\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.358398 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-catalog-content\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.358843 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-catalog-content\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.358889 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-utilities\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.381012 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czkmx\" (UniqueName: \"kubernetes.io/projected/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-kube-api-access-czkmx\") pod \"community-operators-qnx8m\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.406930 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:42 crc kubenswrapper[4782]: I0130 19:33:42.911009 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qnx8m"] Jan 30 19:33:43 crc kubenswrapper[4782]: I0130 19:33:43.835032 4782 generic.go:334] "Generic (PLEG): container finished" podID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerID="feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3" exitCode=0 Jan 30 19:33:43 crc kubenswrapper[4782]: I0130 19:33:43.835092 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnx8m" event={"ID":"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8","Type":"ContainerDied","Data":"feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3"} Jan 30 19:33:43 crc kubenswrapper[4782]: I0130 19:33:43.835503 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnx8m" event={"ID":"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8","Type":"ContainerStarted","Data":"c34456ab8421581bb4af71b3ec451eaf73b06ea34e1411dd695f0963c2ffee63"} Jan 30 19:33:45 crc kubenswrapper[4782]: I0130 19:33:45.864563 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnx8m" event={"ID":"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8","Type":"ContainerStarted","Data":"d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888"} Jan 30 19:33:46 crc kubenswrapper[4782]: I0130 19:33:46.876138 4782 generic.go:334] "Generic (PLEG): container finished" podID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerID="d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888" exitCode=0 Jan 30 19:33:46 crc kubenswrapper[4782]: I0130 19:33:46.876186 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnx8m" event={"ID":"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8","Type":"ContainerDied","Data":"d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888"} Jan 30 19:33:47 crc kubenswrapper[4782]: I0130 19:33:47.887444 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnx8m" event={"ID":"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8","Type":"ContainerStarted","Data":"b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d"} Jan 30 19:33:49 crc kubenswrapper[4782]: I0130 19:33:49.792868 
4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:33:49 crc kubenswrapper[4782]: I0130 19:33:49.793345 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:33:52 crc kubenswrapper[4782]: I0130 19:33:52.408124 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:52 crc kubenswrapper[4782]: I0130 19:33:52.408742 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:52 crc kubenswrapper[4782]: I0130 19:33:52.465645 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:52 crc kubenswrapper[4782]: I0130 19:33:52.495685 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qnx8m" podStartSLOduration=7.001299587 podStartE2EDuration="10.495662257s" podCreationTimestamp="2026-01-30 19:33:42 +0000 UTC" firstStartedPulling="2026-01-30 19:33:43.838082965 +0000 UTC m=+3800.106460990" lastFinishedPulling="2026-01-30 19:33:47.332445625 +0000 UTC m=+3803.600823660" observedRunningTime="2026-01-30 19:33:47.905876027 +0000 UTC m=+3804.174254052" watchObservedRunningTime="2026-01-30 19:33:52.495662257 +0000 UTC m=+3808.764040312" Jan 30 19:33:53 crc kubenswrapper[4782]: I0130 19:33:53.000412 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:53 crc kubenswrapper[4782]: I0130 19:33:53.066466 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qnx8m"] Jan 30 19:33:54 crc kubenswrapper[4782]: I0130 19:33:54.976554 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qnx8m" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="registry-server" containerID="cri-o://b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d" gracePeriod=2 Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.532483 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.568216 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-catalog-content\") pod \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.568431 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czkmx\" (UniqueName: \"kubernetes.io/projected/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-kube-api-access-czkmx\") pod \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.568538 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-utilities\") pod \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\" (UID: \"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8\") " Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.570004 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-utilities" (OuterVolumeSpecName: "utilities") pod "7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" (UID: "7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.581662 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-kube-api-access-czkmx" (OuterVolumeSpecName: "kube-api-access-czkmx") pod "7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" (UID: "7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8"). InnerVolumeSpecName "kube-api-access-czkmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.648528 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" (UID: "7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.671502 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.671556 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czkmx\" (UniqueName: \"kubernetes.io/projected/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-kube-api-access-czkmx\") on node \"crc\" DevicePath \"\"" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.671576 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.990671 4782 generic.go:334] "Generic (PLEG): container finished" podID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerID="b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d" exitCode=0 Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.991003 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnx8m" event={"ID":"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8","Type":"ContainerDied","Data":"b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d"} Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.991029 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnx8m" event={"ID":"7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8","Type":"ContainerDied","Data":"c34456ab8421581bb4af71b3ec451eaf73b06ea34e1411dd695f0963c2ffee63"} Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.991046 4782 scope.go:117] "RemoveContainer" containerID="b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d" Jan 30 19:33:55 crc kubenswrapper[4782]: I0130 19:33:55.991177 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qnx8m" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.031266 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qnx8m"] Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.040200 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qnx8m"] Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.042020 4782 scope.go:117] "RemoveContainer" containerID="d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.076886 4782 scope.go:117] "RemoveContainer" containerID="feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.117779 4782 scope.go:117] "RemoveContainer" containerID="b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d" Jan 30 19:33:56 crc kubenswrapper[4782]: E0130 19:33:56.118602 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d\": container with ID starting with b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d not found: ID does not exist" containerID="b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.118641 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d"} err="failed to get container status \"b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d\": rpc error: code = NotFound desc = could not find container \"b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d\": container with ID starting with b87e6104524d683d31dfeed84c09f9b999d5d05f6766e2621c37ac8571edc50d not found: ID does not exist" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.118689 4782 scope.go:117] "RemoveContainer" containerID="d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888" Jan 30 19:33:56 crc kubenswrapper[4782]: E0130 19:33:56.119983 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888\": container with ID starting with d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888 not found: ID does not exist" containerID="d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.120003 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888"} err="failed to get container status \"d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888\": rpc error: code = NotFound desc = could not find container \"d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888\": container with ID starting with d3a82a7250feec9c77fef1809735a976e25ccca1e124a1285fb908fc8d8f5888 not found: ID does not exist" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.120018 4782 scope.go:117] "RemoveContainer" containerID="feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3" Jan 30 19:33:56 crc kubenswrapper[4782]: E0130 19:33:56.120561 4782 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3\": container with ID starting with feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3 not found: ID does not exist" containerID="feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.120583 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3"} err="failed to get container status \"feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3\": rpc error: code = NotFound desc = could not find container \"feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3\": container with ID starting with feaf0d0874009e1ea8aedaf53fee6f17d804ed096bfff3326e3a6bbcb0b96ad3 not found: ID does not exist" Jan 30 19:33:56 crc kubenswrapper[4782]: I0130 19:33:56.423643 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" path="/var/lib/kubelet/pods/7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8/volumes" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.403448 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-98tw6"] Jan 30 19:34:06 crc kubenswrapper[4782]: E0130 19:34:06.404785 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="registry-server" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.404811 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="registry-server" Jan 30 19:34:06 crc kubenswrapper[4782]: E0130 19:34:06.404872 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="extract-utilities" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.404886 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="extract-utilities" Jan 30 19:34:06 crc kubenswrapper[4782]: E0130 19:34:06.404903 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="extract-content" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.404915 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="extract-content" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.405585 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7cace6-a6ce-4f18-bb5b-e3ea63a586d8" containerName="registry-server" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.410571 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.431201 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tw6"] Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.439607 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prv67\" (UniqueName: \"kubernetes.io/projected/06b3f902-f581-4337-b0fa-a1caa32364d8-kube-api-access-prv67\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.439900 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-utilities\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.440052 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-catalog-content\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.542590 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prv67\" (UniqueName: \"kubernetes.io/projected/06b3f902-f581-4337-b0fa-a1caa32364d8-kube-api-access-prv67\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.542638 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-utilities\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.542659 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-catalog-content\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.543118 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-utilities\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.543366 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-catalog-content\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.568414 4782 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-prv67\" (UniqueName: \"kubernetes.io/projected/06b3f902-f581-4337-b0fa-a1caa32364d8-kube-api-access-prv67\") pod \"redhat-marketplace-98tw6\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:06 crc kubenswrapper[4782]: I0130 19:34:06.740822 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:07 crc kubenswrapper[4782]: I0130 19:34:07.295645 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tw6"] Jan 30 19:34:08 crc kubenswrapper[4782]: I0130 19:34:08.119641 4782 generic.go:334] "Generic (PLEG): container finished" podID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerID="119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9" exitCode=0 Jan 30 19:34:08 crc kubenswrapper[4782]: I0130 19:34:08.119867 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tw6" event={"ID":"06b3f902-f581-4337-b0fa-a1caa32364d8","Type":"ContainerDied","Data":"119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9"} Jan 30 19:34:08 crc kubenswrapper[4782]: I0130 19:34:08.120149 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tw6" event={"ID":"06b3f902-f581-4337-b0fa-a1caa32364d8","Type":"ContainerStarted","Data":"c6369c411c325a20016119917fe0a0008ae79b4a6c760744ac8bd173007ec121"} Jan 30 19:34:09 crc kubenswrapper[4782]: I0130 19:34:09.132586 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tw6" event={"ID":"06b3f902-f581-4337-b0fa-a1caa32364d8","Type":"ContainerStarted","Data":"f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090"} Jan 30 19:34:10 crc kubenswrapper[4782]: I0130 19:34:10.144603 4782 generic.go:334] "Generic (PLEG): container finished" podID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerID="f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090" exitCode=0 Jan 30 19:34:10 crc kubenswrapper[4782]: I0130 19:34:10.146258 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tw6" event={"ID":"06b3f902-f581-4337-b0fa-a1caa32364d8","Type":"ContainerDied","Data":"f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090"} Jan 30 19:34:11 crc kubenswrapper[4782]: I0130 19:34:11.163653 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tw6" event={"ID":"06b3f902-f581-4337-b0fa-a1caa32364d8","Type":"ContainerStarted","Data":"f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead"} Jan 30 19:34:16 crc kubenswrapper[4782]: I0130 19:34:16.740946 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:16 crc kubenswrapper[4782]: I0130 19:34:16.742697 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:16 crc kubenswrapper[4782]: I0130 19:34:16.825176 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:16 crc kubenswrapper[4782]: I0130 19:34:16.845365 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-98tw6" 
podStartSLOduration=8.391684788 podStartE2EDuration="10.845346705s" podCreationTimestamp="2026-01-30 19:34:06 +0000 UTC" firstStartedPulling="2026-01-30 19:34:08.121853481 +0000 UTC m=+3824.390231506" lastFinishedPulling="2026-01-30 19:34:10.575515398 +0000 UTC m=+3826.843893423" observedRunningTime="2026-01-30 19:34:11.194271651 +0000 UTC m=+3827.462649716" watchObservedRunningTime="2026-01-30 19:34:16.845346705 +0000 UTC m=+3833.113724720" Jan 30 19:34:17 crc kubenswrapper[4782]: I0130 19:34:17.307701 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:17 crc kubenswrapper[4782]: I0130 19:34:17.359819 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tw6"] Jan 30 19:34:19 crc kubenswrapper[4782]: I0130 19:34:19.269768 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-98tw6" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="registry-server" containerID="cri-o://f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead" gracePeriod=2 Jan 30 19:34:19 crc kubenswrapper[4782]: I0130 19:34:19.793301 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:34:19 crc kubenswrapper[4782]: I0130 19:34:19.793714 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:34:19 crc kubenswrapper[4782]: I0130 19:34:19.793778 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:34:19 crc kubenswrapper[4782]: I0130 19:34:19.794721 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:34:19 crc kubenswrapper[4782]: I0130 19:34:19.794862 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" gracePeriod=600 Jan 30 19:34:19 crc kubenswrapper[4782]: E0130 19:34:19.917280 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:34:19 crc kubenswrapper[4782]: I0130 19:34:19.998865 4782 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.092579 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-catalog-content\") pod \"06b3f902-f581-4337-b0fa-a1caa32364d8\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.092800 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-utilities\") pod \"06b3f902-f581-4337-b0fa-a1caa32364d8\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.092883 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prv67\" (UniqueName: \"kubernetes.io/projected/06b3f902-f581-4337-b0fa-a1caa32364d8-kube-api-access-prv67\") pod \"06b3f902-f581-4337-b0fa-a1caa32364d8\" (UID: \"06b3f902-f581-4337-b0fa-a1caa32364d8\") " Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.094908 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-utilities" (OuterVolumeSpecName: "utilities") pod "06b3f902-f581-4337-b0fa-a1caa32364d8" (UID: "06b3f902-f581-4337-b0fa-a1caa32364d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.102333 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b3f902-f581-4337-b0fa-a1caa32364d8-kube-api-access-prv67" (OuterVolumeSpecName: "kube-api-access-prv67") pod "06b3f902-f581-4337-b0fa-a1caa32364d8" (UID: "06b3f902-f581-4337-b0fa-a1caa32364d8"). InnerVolumeSpecName "kube-api-access-prv67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.139854 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06b3f902-f581-4337-b0fa-a1caa32364d8" (UID: "06b3f902-f581-4337-b0fa-a1caa32364d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.195648 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.195680 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06b3f902-f581-4337-b0fa-a1caa32364d8-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.195690 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prv67\" (UniqueName: \"kubernetes.io/projected/06b3f902-f581-4337-b0fa-a1caa32364d8-kube-api-access-prv67\") on node \"crc\" DevicePath \"\"" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.281257 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" exitCode=0 Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.281321 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae"} Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.281353 4782 scope.go:117] "RemoveContainer" containerID="234e22ebdde9ac053f0b054b33ad4dc760afea0ecfdc2e6df912b9df0007602e" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.281904 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:34:20 crc kubenswrapper[4782]: E0130 19:34:20.282134 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.286749 4782 generic.go:334] "Generic (PLEG): container finished" podID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerID="f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead" exitCode=0 Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.286791 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tw6" event={"ID":"06b3f902-f581-4337-b0fa-a1caa32364d8","Type":"ContainerDied","Data":"f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead"} Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.286815 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-98tw6" event={"ID":"06b3f902-f581-4337-b0fa-a1caa32364d8","Type":"ContainerDied","Data":"c6369c411c325a20016119917fe0a0008ae79b4a6c760744ac8bd173007ec121"} Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.286880 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-98tw6" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.330185 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tw6"] Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.337888 4782 scope.go:117] "RemoveContainer" containerID="f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.341097 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-98tw6"] Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.374391 4782 scope.go:117] "RemoveContainer" containerID="f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.409899 4782 scope.go:117] "RemoveContainer" containerID="119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.435277 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" path="/var/lib/kubelet/pods/06b3f902-f581-4337-b0fa-a1caa32364d8/volumes" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.449943 4782 scope.go:117] "RemoveContainer" containerID="f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead" Jan 30 19:34:20 crc kubenswrapper[4782]: E0130 19:34:20.451627 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead\": container with ID starting with f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead not found: ID does not exist" containerID="f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.451694 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead"} err="failed to get container status \"f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead\": rpc error: code = NotFound desc = could not find container \"f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead\": container with ID starting with f692b5c739362f51fa67c0ceb5a6b664dd8fe00c6208138a3d037a4edc8b9ead not found: ID does not exist" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.451727 4782 scope.go:117] "RemoveContainer" containerID="f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090" Jan 30 19:34:20 crc kubenswrapper[4782]: E0130 19:34:20.453192 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090\": container with ID starting with f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090 not found: ID does not exist" containerID="f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.453247 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090"} err="failed to get container status \"f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090\": rpc error: code = NotFound desc = could not find container 
\"f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090\": container with ID starting with f6f5786ca2198fe8776c537b056912ad1c46bb3c05ee8372c68ec2c588316090 not found: ID does not exist" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.453270 4782 scope.go:117] "RemoveContainer" containerID="119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9" Jan 30 19:34:20 crc kubenswrapper[4782]: E0130 19:34:20.453498 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9\": container with ID starting with 119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9 not found: ID does not exist" containerID="119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9" Jan 30 19:34:20 crc kubenswrapper[4782]: I0130 19:34:20.453525 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9"} err="failed to get container status \"119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9\": rpc error: code = NotFound desc = could not find container \"119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9\": container with ID starting with 119aadbb9effd4d026a35c20b43d218d8cc2640d105e9dcdbca4d1465027fbc9 not found: ID does not exist" Jan 30 19:34:33 crc kubenswrapper[4782]: I0130 19:34:33.410967 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:34:33 crc kubenswrapper[4782]: E0130 19:34:33.412451 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:34:46 crc kubenswrapper[4782]: I0130 19:34:46.422928 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:34:46 crc kubenswrapper[4782]: E0130 19:34:46.423971 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:34:58 crc kubenswrapper[4782]: I0130 19:34:58.422002 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:34:58 crc kubenswrapper[4782]: E0130 19:34:58.423714 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:35:13 crc kubenswrapper[4782]: I0130 19:35:13.411787 4782 scope.go:117] "RemoveContainer" 
containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:35:13 crc kubenswrapper[4782]: E0130 19:35:13.412905 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:35:28 crc kubenswrapper[4782]: I0130 19:35:28.412391 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:35:28 crc kubenswrapper[4782]: E0130 19:35:28.413336 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:35:40 crc kubenswrapper[4782]: I0130 19:35:40.441063 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:35:40 crc kubenswrapper[4782]: E0130 19:35:40.442370 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:35:51 crc kubenswrapper[4782]: I0130 19:35:51.412881 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:35:51 crc kubenswrapper[4782]: E0130 19:35:51.413935 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:36:03 crc kubenswrapper[4782]: I0130 19:36:03.411823 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:36:03 crc kubenswrapper[4782]: E0130 19:36:03.413012 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:36:16 crc kubenswrapper[4782]: I0130 19:36:16.449343 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:36:16 crc kubenswrapper[4782]: E0130 19:36:16.451516 4782 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:36:27 crc kubenswrapper[4782]: I0130 19:36:27.411609 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:36:27 crc kubenswrapper[4782]: E0130 19:36:27.412621 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:36:40 crc kubenswrapper[4782]: I0130 19:36:40.411343 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:36:40 crc kubenswrapper[4782]: E0130 19:36:40.412623 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.298026 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zxzgp"] Jan 30 19:36:52 crc kubenswrapper[4782]: E0130 19:36:52.299099 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="extract-content" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.299118 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="extract-content" Jan 30 19:36:52 crc kubenswrapper[4782]: E0130 19:36:52.299136 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="extract-utilities" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.299144 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="extract-utilities" Jan 30 19:36:52 crc kubenswrapper[4782]: E0130 19:36:52.299171 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="registry-server" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.299179 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="registry-server" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.299445 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b3f902-f581-4337-b0fa-a1caa32364d8" containerName="registry-server" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.305638 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.321035 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zxzgp"] Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.456945 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knclr\" (UniqueName: \"kubernetes.io/projected/03134986-d922-410f-a220-ef5f5e7529d5-kube-api-access-knclr\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.457485 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-utilities\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.457834 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-catalog-content\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.559821 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-catalog-content\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.560145 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knclr\" (UniqueName: \"kubernetes.io/projected/03134986-d922-410f-a220-ef5f5e7529d5-kube-api-access-knclr\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.560199 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-utilities\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.561049 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-catalog-content\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.561357 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-utilities\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.586788 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-knclr\" (UniqueName: \"kubernetes.io/projected/03134986-d922-410f-a220-ef5f5e7529d5-kube-api-access-knclr\") pod \"certified-operators-zxzgp\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:52 crc kubenswrapper[4782]: I0130 19:36:52.636109 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:36:53 crc kubenswrapper[4782]: I0130 19:36:53.124013 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zxzgp"] Jan 30 19:36:53 crc kubenswrapper[4782]: I0130 19:36:53.410547 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:36:53 crc kubenswrapper[4782]: E0130 19:36:53.410956 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:36:53 crc kubenswrapper[4782]: I0130 19:36:53.614087 4782 generic.go:334] "Generic (PLEG): container finished" podID="03134986-d922-410f-a220-ef5f5e7529d5" containerID="86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf" exitCode=0 Jan 30 19:36:53 crc kubenswrapper[4782]: I0130 19:36:53.614187 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxzgp" event={"ID":"03134986-d922-410f-a220-ef5f5e7529d5","Type":"ContainerDied","Data":"86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf"} Jan 30 19:36:53 crc kubenswrapper[4782]: I0130 19:36:53.614310 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxzgp" event={"ID":"03134986-d922-410f-a220-ef5f5e7529d5","Type":"ContainerStarted","Data":"6a1c2870e6c922449aa2c685295a4c1fc08e9a7829c44d191c4d51f7566adf0c"} Jan 30 19:36:53 crc kubenswrapper[4782]: I0130 19:36:53.616524 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:36:54 crc kubenswrapper[4782]: I0130 19:36:54.629889 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxzgp" event={"ID":"03134986-d922-410f-a220-ef5f5e7529d5","Type":"ContainerStarted","Data":"6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405"} Jan 30 19:36:56 crc kubenswrapper[4782]: I0130 19:36:56.650897 4782 generic.go:334] "Generic (PLEG): container finished" podID="03134986-d922-410f-a220-ef5f5e7529d5" containerID="6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405" exitCode=0 Jan 30 19:36:56 crc kubenswrapper[4782]: I0130 19:36:56.650972 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxzgp" event={"ID":"03134986-d922-410f-a220-ef5f5e7529d5","Type":"ContainerDied","Data":"6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405"} Jan 30 19:36:57 crc kubenswrapper[4782]: I0130 19:36:57.664867 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxzgp" 
event={"ID":"03134986-d922-410f-a220-ef5f5e7529d5","Type":"ContainerStarted","Data":"586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42"} Jan 30 19:36:57 crc kubenswrapper[4782]: I0130 19:36:57.695812 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zxzgp" podStartSLOduration=2.260727677 podStartE2EDuration="5.695788035s" podCreationTimestamp="2026-01-30 19:36:52 +0000 UTC" firstStartedPulling="2026-01-30 19:36:53.61628659 +0000 UTC m=+3989.884664615" lastFinishedPulling="2026-01-30 19:36:57.051346918 +0000 UTC m=+3993.319724973" observedRunningTime="2026-01-30 19:36:57.686430673 +0000 UTC m=+3993.954808738" watchObservedRunningTime="2026-01-30 19:36:57.695788035 +0000 UTC m=+3993.964166090" Jan 30 19:37:02 crc kubenswrapper[4782]: I0130 19:37:02.637147 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:37:02 crc kubenswrapper[4782]: I0130 19:37:02.637864 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:37:02 crc kubenswrapper[4782]: I0130 19:37:02.712953 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:37:02 crc kubenswrapper[4782]: I0130 19:37:02.792113 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:37:02 crc kubenswrapper[4782]: I0130 19:37:02.979145 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zxzgp"] Jan 30 19:37:04 crc kubenswrapper[4782]: I0130 19:37:04.742103 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zxzgp" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="registry-server" containerID="cri-o://586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42" gracePeriod=2 Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.221868 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.271096 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knclr\" (UniqueName: \"kubernetes.io/projected/03134986-d922-410f-a220-ef5f5e7529d5-kube-api-access-knclr\") pod \"03134986-d922-410f-a220-ef5f5e7529d5\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.271217 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-catalog-content\") pod \"03134986-d922-410f-a220-ef5f5e7529d5\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.271332 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-utilities\") pod \"03134986-d922-410f-a220-ef5f5e7529d5\" (UID: \"03134986-d922-410f-a220-ef5f5e7529d5\") " Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.272294 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-utilities" (OuterVolumeSpecName: "utilities") pod "03134986-d922-410f-a220-ef5f5e7529d5" (UID: "03134986-d922-410f-a220-ef5f5e7529d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.279448 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03134986-d922-410f-a220-ef5f5e7529d5-kube-api-access-knclr" (OuterVolumeSpecName: "kube-api-access-knclr") pod "03134986-d922-410f-a220-ef5f5e7529d5" (UID: "03134986-d922-410f-a220-ef5f5e7529d5"). InnerVolumeSpecName "kube-api-access-knclr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.320116 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03134986-d922-410f-a220-ef5f5e7529d5" (UID: "03134986-d922-410f-a220-ef5f5e7529d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.374017 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.374055 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03134986-d922-410f-a220-ef5f5e7529d5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.374065 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knclr\" (UniqueName: \"kubernetes.io/projected/03134986-d922-410f-a220-ef5f5e7529d5-kube-api-access-knclr\") on node \"crc\" DevicePath \"\"" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.754020 4782 generic.go:334] "Generic (PLEG): container finished" podID="03134986-d922-410f-a220-ef5f5e7529d5" containerID="586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42" exitCode=0 Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.754115 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zxzgp" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.754166 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxzgp" event={"ID":"03134986-d922-410f-a220-ef5f5e7529d5","Type":"ContainerDied","Data":"586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42"} Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.754616 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxzgp" event={"ID":"03134986-d922-410f-a220-ef5f5e7529d5","Type":"ContainerDied","Data":"6a1c2870e6c922449aa2c685295a4c1fc08e9a7829c44d191c4d51f7566adf0c"} Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.754649 4782 scope.go:117] "RemoveContainer" containerID="586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.789899 4782 scope.go:117] "RemoveContainer" containerID="6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.802374 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zxzgp"] Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.847574 4782 scope.go:117] "RemoveContainer" containerID="86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.848710 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zxzgp"] Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.922353 4782 scope.go:117] "RemoveContainer" containerID="586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42" Jan 30 19:37:05 crc kubenswrapper[4782]: E0130 19:37:05.930795 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42\": container with ID starting with 586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42 not found: ID does not exist" containerID="586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.930835 
4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42"} err="failed to get container status \"586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42\": rpc error: code = NotFound desc = could not find container \"586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42\": container with ID starting with 586df2d4c1df86d971c9da085ec4a9b4387aac89290a8de18325116bef3dab42 not found: ID does not exist" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.930859 4782 scope.go:117] "RemoveContainer" containerID="6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405" Jan 30 19:37:05 crc kubenswrapper[4782]: E0130 19:37:05.938222 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405\": container with ID starting with 6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405 not found: ID does not exist" containerID="6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.938275 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405"} err="failed to get container status \"6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405\": rpc error: code = NotFound desc = could not find container \"6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405\": container with ID starting with 6d80dcdb2156911f9c8c1c645b9062228fa78b1021f0fbc933e699cdea9ab405 not found: ID does not exist" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.938300 4782 scope.go:117] "RemoveContainer" containerID="86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf" Jan 30 19:37:05 crc kubenswrapper[4782]: E0130 19:37:05.941777 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf\": container with ID starting with 86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf not found: ID does not exist" containerID="86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf" Jan 30 19:37:05 crc kubenswrapper[4782]: I0130 19:37:05.941810 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf"} err="failed to get container status \"86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf\": rpc error: code = NotFound desc = could not find container \"86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf\": container with ID starting with 86d6a7f41279388b9a145c2725e815d74aa8c6e114b008a9a2409a56f4d68fbf not found: ID does not exist" Jan 30 19:37:06 crc kubenswrapper[4782]: I0130 19:37:06.427772 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03134986-d922-410f-a220-ef5f5e7529d5" path="/var/lib/kubelet/pods/03134986-d922-410f-a220-ef5f5e7529d5/volumes" Jan 30 19:37:07 crc kubenswrapper[4782]: I0130 19:37:07.410667 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:37:07 crc kubenswrapper[4782]: E0130 19:37:07.411276 4782 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:37:18 crc kubenswrapper[4782]: I0130 19:37:18.411168 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:37:18 crc kubenswrapper[4782]: E0130 19:37:18.412096 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:37:29 crc kubenswrapper[4782]: I0130 19:37:29.411968 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:37:29 crc kubenswrapper[4782]: E0130 19:37:29.413175 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:37:41 crc kubenswrapper[4782]: I0130 19:37:41.410715 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:37:41 crc kubenswrapper[4782]: E0130 19:37:41.412975 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:37:56 crc kubenswrapper[4782]: I0130 19:37:56.410931 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:37:56 crc kubenswrapper[4782]: E0130 19:37:56.411730 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:38:08 crc kubenswrapper[4782]: I0130 19:38:08.411065 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:38:08 crc kubenswrapper[4782]: E0130 19:38:08.412133 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:38:20 crc kubenswrapper[4782]: I0130 19:38:20.411910 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:38:20 crc kubenswrapper[4782]: E0130 19:38:20.413074 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:38:34 crc kubenswrapper[4782]: I0130 19:38:34.424781 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:38:34 crc kubenswrapper[4782]: E0130 19:38:34.425816 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:38:46 crc kubenswrapper[4782]: I0130 19:38:46.413090 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:38:46 crc kubenswrapper[4782]: E0130 19:38:46.416056 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:39:00 crc kubenswrapper[4782]: I0130 19:39:00.413707 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:39:00 crc kubenswrapper[4782]: E0130 19:39:00.416345 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:39:11 crc kubenswrapper[4782]: I0130 19:39:11.410336 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:39:11 crc kubenswrapper[4782]: E0130 19:39:11.411104 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" 
podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:39:23 crc kubenswrapper[4782]: I0130 19:39:23.412004 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:39:24 crc kubenswrapper[4782]: I0130 19:39:24.335155 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"a1e6413f5218f1d431a598d9de2283dc83d8895b91f555eddecade8fb6f54c21"} Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.060519 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xdwvx"] Jan 30 19:39:33 crc kubenswrapper[4782]: E0130 19:39:33.061532 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="extract-utilities" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.061546 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="extract-utilities" Jan 30 19:39:33 crc kubenswrapper[4782]: E0130 19:39:33.061575 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="extract-content" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.061581 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="extract-content" Jan 30 19:39:33 crc kubenswrapper[4782]: E0130 19:39:33.061589 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="registry-server" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.061595 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="registry-server" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.061803 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="03134986-d922-410f-a220-ef5f5e7529d5" containerName="registry-server" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.063201 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.098282 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdwvx"] Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.222661 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-utilities\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.222718 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-catalog-content\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.222802 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znng8\" (UniqueName: \"kubernetes.io/projected/90ea8feb-d5d1-4de5-a813-506d0509138c-kube-api-access-znng8\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.324373 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-utilities\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.324444 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-catalog-content\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.324495 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znng8\" (UniqueName: \"kubernetes.io/projected/90ea8feb-d5d1-4de5-a813-506d0509138c-kube-api-access-znng8\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.324903 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-utilities\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.325043 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-catalog-content\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.351543 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-znng8\" (UniqueName: \"kubernetes.io/projected/90ea8feb-d5d1-4de5-a813-506d0509138c-kube-api-access-znng8\") pod \"redhat-operators-xdwvx\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.385593 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:33 crc kubenswrapper[4782]: I0130 19:39:33.863787 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdwvx"] Jan 30 19:39:34 crc kubenswrapper[4782]: I0130 19:39:34.452467 4782 generic.go:334] "Generic (PLEG): container finished" podID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerID="a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b" exitCode=0 Jan 30 19:39:34 crc kubenswrapper[4782]: I0130 19:39:34.452736 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdwvx" event={"ID":"90ea8feb-d5d1-4de5-a813-506d0509138c","Type":"ContainerDied","Data":"a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b"} Jan 30 19:39:34 crc kubenswrapper[4782]: I0130 19:39:34.452766 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdwvx" event={"ID":"90ea8feb-d5d1-4de5-a813-506d0509138c","Type":"ContainerStarted","Data":"2fec449e7e8cfdf913b094de8090349f6d4be5e00da80300fc39a7b48d752821"} Jan 30 19:39:35 crc kubenswrapper[4782]: I0130 19:39:35.466180 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdwvx" event={"ID":"90ea8feb-d5d1-4de5-a813-506d0509138c","Type":"ContainerStarted","Data":"1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9"} Jan 30 19:39:40 crc kubenswrapper[4782]: I0130 19:39:40.521771 4782 generic.go:334] "Generic (PLEG): container finished" podID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerID="1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9" exitCode=0 Jan 30 19:39:40 crc kubenswrapper[4782]: I0130 19:39:40.521993 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdwvx" event={"ID":"90ea8feb-d5d1-4de5-a813-506d0509138c","Type":"ContainerDied","Data":"1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9"} Jan 30 19:39:41 crc kubenswrapper[4782]: I0130 19:39:41.538811 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdwvx" event={"ID":"90ea8feb-d5d1-4de5-a813-506d0509138c","Type":"ContainerStarted","Data":"f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1"} Jan 30 19:39:41 crc kubenswrapper[4782]: I0130 19:39:41.574325 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xdwvx" podStartSLOduration=2.120835188 podStartE2EDuration="8.574295977s" podCreationTimestamp="2026-01-30 19:39:33 +0000 UTC" firstStartedPulling="2026-01-30 19:39:34.454513139 +0000 UTC m=+4150.722891164" lastFinishedPulling="2026-01-30 19:39:40.907973928 +0000 UTC m=+4157.176351953" observedRunningTime="2026-01-30 19:39:41.563629863 +0000 UTC m=+4157.832007918" watchObservedRunningTime="2026-01-30 19:39:41.574295977 +0000 UTC m=+4157.842674042" Jan 30 19:39:43 crc kubenswrapper[4782]: I0130 19:39:43.386895 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 
19:39:43 crc kubenswrapper[4782]: I0130 19:39:43.387892 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:39:44 crc kubenswrapper[4782]: I0130 19:39:44.475721 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xdwvx" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="registry-server" probeResult="failure" output=< Jan 30 19:39:44 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:39:44 crc kubenswrapper[4782]: > Jan 30 19:39:54 crc kubenswrapper[4782]: I0130 19:39:54.468943 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xdwvx" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="registry-server" probeResult="failure" output=< Jan 30 19:39:54 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:39:54 crc kubenswrapper[4782]: > Jan 30 19:40:03 crc kubenswrapper[4782]: I0130 19:40:03.467550 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:40:03 crc kubenswrapper[4782]: I0130 19:40:03.546369 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:40:04 crc kubenswrapper[4782]: I0130 19:40:04.277895 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdwvx"] Jan 30 19:40:04 crc kubenswrapper[4782]: I0130 19:40:04.809192 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xdwvx" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="registry-server" containerID="cri-o://f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1" gracePeriod=2 Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.305929 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.375084 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-catalog-content\") pod \"90ea8feb-d5d1-4de5-a813-506d0509138c\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.375214 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-utilities\") pod \"90ea8feb-d5d1-4de5-a813-506d0509138c\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.375310 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znng8\" (UniqueName: \"kubernetes.io/projected/90ea8feb-d5d1-4de5-a813-506d0509138c-kube-api-access-znng8\") pod \"90ea8feb-d5d1-4de5-a813-506d0509138c\" (UID: \"90ea8feb-d5d1-4de5-a813-506d0509138c\") " Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.376182 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-utilities" (OuterVolumeSpecName: "utilities") pod "90ea8feb-d5d1-4de5-a813-506d0509138c" (UID: "90ea8feb-d5d1-4de5-a813-506d0509138c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.386504 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90ea8feb-d5d1-4de5-a813-506d0509138c-kube-api-access-znng8" (OuterVolumeSpecName: "kube-api-access-znng8") pod "90ea8feb-d5d1-4de5-a813-506d0509138c" (UID: "90ea8feb-d5d1-4de5-a813-506d0509138c"). InnerVolumeSpecName "kube-api-access-znng8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.477935 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.477974 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znng8\" (UniqueName: \"kubernetes.io/projected/90ea8feb-d5d1-4de5-a813-506d0509138c-kube-api-access-znng8\") on node \"crc\" DevicePath \"\"" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.501403 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90ea8feb-d5d1-4de5-a813-506d0509138c" (UID: "90ea8feb-d5d1-4de5-a813-506d0509138c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.580523 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90ea8feb-d5d1-4de5-a813-506d0509138c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.819049 4782 generic.go:334] "Generic (PLEG): container finished" podID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerID="f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1" exitCode=0 Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.819088 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdwvx" event={"ID":"90ea8feb-d5d1-4de5-a813-506d0509138c","Type":"ContainerDied","Data":"f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1"} Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.819113 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdwvx" event={"ID":"90ea8feb-d5d1-4de5-a813-506d0509138c","Type":"ContainerDied","Data":"2fec449e7e8cfdf913b094de8090349f6d4be5e00da80300fc39a7b48d752821"} Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.819130 4782 scope.go:117] "RemoveContainer" containerID="f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.819272 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xdwvx" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.851943 4782 scope.go:117] "RemoveContainer" containerID="1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.863124 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdwvx"] Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.875099 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xdwvx"] Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.890978 4782 scope.go:117] "RemoveContainer" containerID="a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.945004 4782 scope.go:117] "RemoveContainer" containerID="f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1" Jan 30 19:40:05 crc kubenswrapper[4782]: E0130 19:40:05.945516 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1\": container with ID starting with f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1 not found: ID does not exist" containerID="f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.945577 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1"} err="failed to get container status \"f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1\": rpc error: code = NotFound desc = could not find container \"f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1\": container with ID starting with f6569a5fd5c67c313643901f2a3b07d383fa195612290310ea6b1169f1919de1 not found: ID does not exist" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.945604 4782 scope.go:117] "RemoveContainer" containerID="1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9" Jan 30 19:40:05 crc kubenswrapper[4782]: E0130 19:40:05.945971 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9\": container with ID starting with 1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9 not found: ID does not exist" containerID="1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.946031 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9"} err="failed to get container status \"1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9\": rpc error: code = NotFound desc = could not find container \"1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9\": container with ID starting with 1ff572d4925092f7794d0915464fe95085c71f1abc7abad76c456df9ceb363c9 not found: ID does not exist" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.946050 4782 scope.go:117] "RemoveContainer" containerID="a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b" Jan 30 19:40:05 crc kubenswrapper[4782]: E0130 19:40:05.946343 4782 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b\": container with ID starting with a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b not found: ID does not exist" containerID="a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b" Jan 30 19:40:05 crc kubenswrapper[4782]: I0130 19:40:05.946377 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b"} err="failed to get container status \"a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b\": rpc error: code = NotFound desc = could not find container \"a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b\": container with ID starting with a3e0a670ffba23fbb46e7bfc8352da098f232e5d0d4798e286a65600c106da8b not found: ID does not exist" Jan 30 19:40:06 crc kubenswrapper[4782]: I0130 19:40:06.423377 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" path="/var/lib/kubelet/pods/90ea8feb-d5d1-4de5-a813-506d0509138c/volumes" Jan 30 19:41:49 crc kubenswrapper[4782]: I0130 19:41:49.792950 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:41:49 crc kubenswrapper[4782]: I0130 19:41:49.793548 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:42:19 crc kubenswrapper[4782]: I0130 19:42:19.792650 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:42:19 crc kubenswrapper[4782]: I0130 19:42:19.793626 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:42:49 crc kubenswrapper[4782]: I0130 19:42:49.792502 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:42:49 crc kubenswrapper[4782]: I0130 19:42:49.793185 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:42:49 crc kubenswrapper[4782]: I0130 19:42:49.793285 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:42:49 crc kubenswrapper[4782]: I0130 19:42:49.794473 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1e6413f5218f1d431a598d9de2283dc83d8895b91f555eddecade8fb6f54c21"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:42:49 crc kubenswrapper[4782]: I0130 19:42:49.794581 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://a1e6413f5218f1d431a598d9de2283dc83d8895b91f555eddecade8fb6f54c21" gracePeriod=600 Jan 30 19:42:50 crc kubenswrapper[4782]: I0130 19:42:50.734878 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="a1e6413f5218f1d431a598d9de2283dc83d8895b91f555eddecade8fb6f54c21" exitCode=0 Jan 30 19:42:50 crc kubenswrapper[4782]: I0130 19:42:50.734961 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"a1e6413f5218f1d431a598d9de2283dc83d8895b91f555eddecade8fb6f54c21"} Jan 30 19:42:50 crc kubenswrapper[4782]: I0130 19:42:50.735376 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d"} Jan 30 19:42:50 crc kubenswrapper[4782]: I0130 19:42:50.735398 4782 scope.go:117] "RemoveContainer" containerID="94a3d1432ceda35db41617242af3918b88d8eb8e691bf52acfa53a9e2d1397ae" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.153779 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lvxjx"] Jan 30 19:43:59 crc kubenswrapper[4782]: E0130 19:43:59.155595 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="extract-utilities" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.155633 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="extract-utilities" Jan 30 19:43:59 crc kubenswrapper[4782]: E0130 19:43:59.155708 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="registry-server" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.155729 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="registry-server" Jan 30 19:43:59 crc kubenswrapper[4782]: E0130 19:43:59.155770 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="extract-content" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.155790 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="extract-content" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.156371 4782 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="90ea8feb-d5d1-4de5-a813-506d0509138c" containerName="registry-server" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.171060 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.172544 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvxjx"] Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.257037 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjfpc\" (UniqueName: \"kubernetes.io/projected/4692d577-b2e2-453f-bbf2-6663d36d9be3-kube-api-access-gjfpc\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.257118 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-utilities\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.257186 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-catalog-content\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.359433 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjfpc\" (UniqueName: \"kubernetes.io/projected/4692d577-b2e2-453f-bbf2-6663d36d9be3-kube-api-access-gjfpc\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.359516 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-utilities\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.359578 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-catalog-content\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.360080 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-utilities\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.360177 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-catalog-content\") pod \"community-operators-lvxjx\" (UID: 
\"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.387418 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjfpc\" (UniqueName: \"kubernetes.io/projected/4692d577-b2e2-453f-bbf2-6663d36d9be3-kube-api-access-gjfpc\") pod \"community-operators-lvxjx\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:43:59 crc kubenswrapper[4782]: I0130 19:43:59.503498 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:44:00 crc kubenswrapper[4782]: I0130 19:44:00.089632 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lvxjx"] Jan 30 19:44:00 crc kubenswrapper[4782]: I0130 19:44:00.543491 4782 generic.go:334] "Generic (PLEG): container finished" podID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerID="c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a" exitCode=0 Jan 30 19:44:00 crc kubenswrapper[4782]: I0130 19:44:00.543549 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvxjx" event={"ID":"4692d577-b2e2-453f-bbf2-6663d36d9be3","Type":"ContainerDied","Data":"c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a"} Jan 30 19:44:00 crc kubenswrapper[4782]: I0130 19:44:00.543778 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvxjx" event={"ID":"4692d577-b2e2-453f-bbf2-6663d36d9be3","Type":"ContainerStarted","Data":"b8a6279ef2bf4f009c82993469eec8ce99fc068878e1887adba4d577774a78a2"} Jan 30 19:44:00 crc kubenswrapper[4782]: I0130 19:44:00.546015 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:44:01 crc kubenswrapper[4782]: I0130 19:44:01.554423 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvxjx" event={"ID":"4692d577-b2e2-453f-bbf2-6663d36d9be3","Type":"ContainerStarted","Data":"062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71"} Jan 30 19:44:03 crc kubenswrapper[4782]: I0130 19:44:03.580720 4782 generic.go:334] "Generic (PLEG): container finished" podID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerID="062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71" exitCode=0 Jan 30 19:44:03 crc kubenswrapper[4782]: I0130 19:44:03.580769 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvxjx" event={"ID":"4692d577-b2e2-453f-bbf2-6663d36d9be3","Type":"ContainerDied","Data":"062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71"} Jan 30 19:44:04 crc kubenswrapper[4782]: I0130 19:44:04.594401 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvxjx" event={"ID":"4692d577-b2e2-453f-bbf2-6663d36d9be3","Type":"ContainerStarted","Data":"16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914"} Jan 30 19:44:04 crc kubenswrapper[4782]: I0130 19:44:04.618971 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lvxjx" podStartSLOduration=2.140466615 podStartE2EDuration="5.618950027s" podCreationTimestamp="2026-01-30 19:43:59 +0000 UTC" firstStartedPulling="2026-01-30 19:44:00.545717445 +0000 UTC m=+4416.814095480" 
lastFinishedPulling="2026-01-30 19:44:04.024200867 +0000 UTC m=+4420.292578892" observedRunningTime="2026-01-30 19:44:04.611105043 +0000 UTC m=+4420.879483068" watchObservedRunningTime="2026-01-30 19:44:04.618950027 +0000 UTC m=+4420.887328052" Jan 30 19:44:09 crc kubenswrapper[4782]: I0130 19:44:09.505042 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:44:09 crc kubenswrapper[4782]: I0130 19:44:09.506404 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:44:09 crc kubenswrapper[4782]: I0130 19:44:09.597884 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:44:09 crc kubenswrapper[4782]: I0130 19:44:09.691584 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:44:09 crc kubenswrapper[4782]: I0130 19:44:09.842760 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvxjx"] Jan 30 19:44:11 crc kubenswrapper[4782]: I0130 19:44:11.663132 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lvxjx" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="registry-server" containerID="cri-o://16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914" gracePeriod=2 Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.170703 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.241565 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjfpc\" (UniqueName: \"kubernetes.io/projected/4692d577-b2e2-453f-bbf2-6663d36d9be3-kube-api-access-gjfpc\") pod \"4692d577-b2e2-453f-bbf2-6663d36d9be3\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.241681 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-utilities\") pod \"4692d577-b2e2-453f-bbf2-6663d36d9be3\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.241976 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-catalog-content\") pod \"4692d577-b2e2-453f-bbf2-6663d36d9be3\" (UID: \"4692d577-b2e2-453f-bbf2-6663d36d9be3\") " Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.242860 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-utilities" (OuterVolumeSpecName: "utilities") pod "4692d577-b2e2-453f-bbf2-6663d36d9be3" (UID: "4692d577-b2e2-453f-bbf2-6663d36d9be3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.254189 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4692d577-b2e2-453f-bbf2-6663d36d9be3-kube-api-access-gjfpc" (OuterVolumeSpecName: "kube-api-access-gjfpc") pod "4692d577-b2e2-453f-bbf2-6663d36d9be3" (UID: "4692d577-b2e2-453f-bbf2-6663d36d9be3"). InnerVolumeSpecName "kube-api-access-gjfpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.344850 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjfpc\" (UniqueName: \"kubernetes.io/projected/4692d577-b2e2-453f-bbf2-6663d36d9be3-kube-api-access-gjfpc\") on node \"crc\" DevicePath \"\"" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.344878 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.357409 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4692d577-b2e2-453f-bbf2-6663d36d9be3" (UID: "4692d577-b2e2-453f-bbf2-6663d36d9be3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.446525 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4692d577-b2e2-453f-bbf2-6663d36d9be3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.684121 4782 generic.go:334] "Generic (PLEG): container finished" podID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerID="16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914" exitCode=0 Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.684584 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvxjx" event={"ID":"4692d577-b2e2-453f-bbf2-6663d36d9be3","Type":"ContainerDied","Data":"16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914"} Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.684627 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lvxjx" event={"ID":"4692d577-b2e2-453f-bbf2-6663d36d9be3","Type":"ContainerDied","Data":"b8a6279ef2bf4f009c82993469eec8ce99fc068878e1887adba4d577774a78a2"} Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.684651 4782 scope.go:117] "RemoveContainer" containerID="16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.684877 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lvxjx" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.717550 4782 scope.go:117] "RemoveContainer" containerID="062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.720460 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lvxjx"] Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.737619 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lvxjx"] Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.742978 4782 scope.go:117] "RemoveContainer" containerID="c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.813535 4782 scope.go:117] "RemoveContainer" containerID="16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914" Jan 30 19:44:12 crc kubenswrapper[4782]: E0130 19:44:12.813966 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914\": container with ID starting with 16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914 not found: ID does not exist" containerID="16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.814008 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914"} err="failed to get container status \"16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914\": rpc error: code = NotFound desc = could not find container \"16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914\": container with ID starting with 16e0c8a1db59264a224b57782b84950b4d9a66aa59fd38ef6423cb02087e6914 not found: ID does not exist" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.814033 4782 scope.go:117] "RemoveContainer" containerID="062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71" Jan 30 19:44:12 crc kubenswrapper[4782]: E0130 19:44:12.814418 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71\": container with ID starting with 062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71 not found: ID does not exist" containerID="062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.814509 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71"} err="failed to get container status \"062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71\": rpc error: code = NotFound desc = could not find container \"062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71\": container with ID starting with 062b5762fa4ef42359f669be6a5efbe85cd590e78e116295bcdfff34de19ac71 not found: ID does not exist" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.814580 4782 scope.go:117] "RemoveContainer" containerID="c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a" Jan 30 19:44:12 crc kubenswrapper[4782]: E0130 19:44:12.815031 4782 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a\": container with ID starting with c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a not found: ID does not exist" containerID="c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a" Jan 30 19:44:12 crc kubenswrapper[4782]: I0130 19:44:12.815066 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a"} err="failed to get container status \"c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a\": rpc error: code = NotFound desc = could not find container \"c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a\": container with ID starting with c7cb1689b686b63a930680855c1b2c58ff4ea8694d944d6d9b7ed6f5f6a7267a not found: ID does not exist" Jan 30 19:44:14 crc kubenswrapper[4782]: I0130 19:44:14.430766 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" path="/var/lib/kubelet/pods/4692d577-b2e2-453f-bbf2-6663d36d9be3/volumes" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.187750 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7"] Jan 30 19:45:00 crc kubenswrapper[4782]: E0130 19:45:00.189045 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="registry-server" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.189083 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="registry-server" Jan 30 19:45:00 crc kubenswrapper[4782]: E0130 19:45:00.189121 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="extract-utilities" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.189135 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="extract-utilities" Jan 30 19:45:00 crc kubenswrapper[4782]: E0130 19:45:00.189163 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="extract-content" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.189178 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="extract-content" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.189602 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="4692d577-b2e2-453f-bbf2-6663d36d9be3" containerName="registry-server" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.190889 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.194485 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.199160 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.217269 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7"] Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.344053 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5p95\" (UniqueName: \"kubernetes.io/projected/dc022036-0478-4fa0-99e2-65c00a67d8f7-kube-api-access-d5p95\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.344245 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc022036-0478-4fa0-99e2-65c00a67d8f7-config-volume\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.344308 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc022036-0478-4fa0-99e2-65c00a67d8f7-secret-volume\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.447913 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5p95\" (UniqueName: \"kubernetes.io/projected/dc022036-0478-4fa0-99e2-65c00a67d8f7-kube-api-access-d5p95\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.448090 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc022036-0478-4fa0-99e2-65c00a67d8f7-config-volume\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.448161 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc022036-0478-4fa0-99e2-65c00a67d8f7-secret-volume\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.450139 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc022036-0478-4fa0-99e2-65c00a67d8f7-config-volume\") pod 
\"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.466174 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc022036-0478-4fa0-99e2-65c00a67d8f7-secret-volume\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.474947 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5p95\" (UniqueName: \"kubernetes.io/projected/dc022036-0478-4fa0-99e2-65c00a67d8f7-kube-api-access-d5p95\") pod \"collect-profiles-29496705-mr5m7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.518567 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:00 crc kubenswrapper[4782]: I0130 19:45:00.965964 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7"] Jan 30 19:45:01 crc kubenswrapper[4782]: I0130 19:45:01.255219 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" event={"ID":"dc022036-0478-4fa0-99e2-65c00a67d8f7","Type":"ContainerStarted","Data":"85226bd5ffeb99b82edf5a52efa128a0ba5466077852066b10ad216b423dea8f"} Jan 30 19:45:01 crc kubenswrapper[4782]: I0130 19:45:01.255318 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" event={"ID":"dc022036-0478-4fa0-99e2-65c00a67d8f7","Type":"ContainerStarted","Data":"1e65ec38b5dc1cb292cfee84b512e2c6c2fcca0f19dbb7e490970865e8fe9dc0"} Jan 30 19:45:01 crc kubenswrapper[4782]: I0130 19:45:01.279144 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" podStartSLOduration=1.279122402 podStartE2EDuration="1.279122402s" podCreationTimestamp="2026-01-30 19:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 19:45:01.272328744 +0000 UTC m=+4477.540706779" watchObservedRunningTime="2026-01-30 19:45:01.279122402 +0000 UTC m=+4477.547500427" Jan 30 19:45:02 crc kubenswrapper[4782]: I0130 19:45:02.268071 4782 generic.go:334] "Generic (PLEG): container finished" podID="dc022036-0478-4fa0-99e2-65c00a67d8f7" containerID="85226bd5ffeb99b82edf5a52efa128a0ba5466077852066b10ad216b423dea8f" exitCode=0 Jan 30 19:45:02 crc kubenswrapper[4782]: I0130 19:45:02.268145 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" event={"ID":"dc022036-0478-4fa0-99e2-65c00a67d8f7","Type":"ContainerDied","Data":"85226bd5ffeb99b82edf5a52efa128a0ba5466077852066b10ad216b423dea8f"} Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.661072 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.731220 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc022036-0478-4fa0-99e2-65c00a67d8f7-secret-volume\") pod \"dc022036-0478-4fa0-99e2-65c00a67d8f7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.731348 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc022036-0478-4fa0-99e2-65c00a67d8f7-config-volume\") pod \"dc022036-0478-4fa0-99e2-65c00a67d8f7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.732269 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc022036-0478-4fa0-99e2-65c00a67d8f7-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc022036-0478-4fa0-99e2-65c00a67d8f7" (UID: "dc022036-0478-4fa0-99e2-65c00a67d8f7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.732546 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5p95\" (UniqueName: \"kubernetes.io/projected/dc022036-0478-4fa0-99e2-65c00a67d8f7-kube-api-access-d5p95\") pod \"dc022036-0478-4fa0-99e2-65c00a67d8f7\" (UID: \"dc022036-0478-4fa0-99e2-65c00a67d8f7\") " Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.733887 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc022036-0478-4fa0-99e2-65c00a67d8f7-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.739518 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc022036-0478-4fa0-99e2-65c00a67d8f7-kube-api-access-d5p95" (OuterVolumeSpecName: "kube-api-access-d5p95") pod "dc022036-0478-4fa0-99e2-65c00a67d8f7" (UID: "dc022036-0478-4fa0-99e2-65c00a67d8f7"). InnerVolumeSpecName "kube-api-access-d5p95". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.752110 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc022036-0478-4fa0-99e2-65c00a67d8f7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc022036-0478-4fa0-99e2-65c00a67d8f7" (UID: "dc022036-0478-4fa0-99e2-65c00a67d8f7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.836737 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5p95\" (UniqueName: \"kubernetes.io/projected/dc022036-0478-4fa0-99e2-65c00a67d8f7-kube-api-access-d5p95\") on node \"crc\" DevicePath \"\"" Jan 30 19:45:03 crc kubenswrapper[4782]: I0130 19:45:03.837074 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc022036-0478-4fa0-99e2-65c00a67d8f7-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 19:45:04 crc kubenswrapper[4782]: I0130 19:45:04.296002 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" event={"ID":"dc022036-0478-4fa0-99e2-65c00a67d8f7","Type":"ContainerDied","Data":"1e65ec38b5dc1cb292cfee84b512e2c6c2fcca0f19dbb7e490970865e8fe9dc0"} Jan 30 19:45:04 crc kubenswrapper[4782]: I0130 19:45:04.296052 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496705-mr5m7" Jan 30 19:45:04 crc kubenswrapper[4782]: I0130 19:45:04.296066 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e65ec38b5dc1cb292cfee84b512e2c6c2fcca0f19dbb7e490970865e8fe9dc0" Jan 30 19:45:04 crc kubenswrapper[4782]: I0130 19:45:04.365936 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b"] Jan 30 19:45:04 crc kubenswrapper[4782]: I0130 19:45:04.378172 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496660-7kl9b"] Jan 30 19:45:04 crc kubenswrapper[4782]: I0130 19:45:04.425980 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad8992c9-9e40-46e3-9c11-d70d863f01c8" path="/var/lib/kubelet/pods/ad8992c9-9e40-46e3-9c11-d70d863f01c8/volumes" Jan 30 19:45:19 crc kubenswrapper[4782]: I0130 19:45:19.793414 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:45:19 crc kubenswrapper[4782]: I0130 19:45:19.794054 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:45:36 crc kubenswrapper[4782]: I0130 19:45:36.275027 4782 scope.go:117] "RemoveContainer" containerID="a19ad96b7b4446d607ed9cb46070d676a916204bbc339abdd0da171ff7b673fb" Jan 30 19:45:49 crc kubenswrapper[4782]: I0130 19:45:49.793014 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:45:49 crc kubenswrapper[4782]: I0130 19:45:49.793527 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:46:19 crc kubenswrapper[4782]: I0130 19:46:19.793305 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:46:19 crc kubenswrapper[4782]: I0130 19:46:19.794029 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:46:19 crc kubenswrapper[4782]: I0130 19:46:19.794094 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:46:19 crc kubenswrapper[4782]: I0130 19:46:19.795384 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:46:19 crc kubenswrapper[4782]: I0130 19:46:19.795507 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" gracePeriod=600 Jan 30 19:46:19 crc kubenswrapper[4782]: E0130 19:46:19.917575 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:46:20 crc kubenswrapper[4782]: I0130 19:46:20.207314 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" exitCode=0 Jan 30 19:46:20 crc kubenswrapper[4782]: I0130 19:46:20.207643 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d"} Jan 30 19:46:20 crc kubenswrapper[4782]: I0130 19:46:20.207675 4782 scope.go:117] "RemoveContainer" containerID="a1e6413f5218f1d431a598d9de2283dc83d8895b91f555eddecade8fb6f54c21" Jan 30 19:46:20 crc kubenswrapper[4782]: I0130 19:46:20.208439 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:46:20 crc kubenswrapper[4782]: E0130 19:46:20.208732 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:46:31 crc kubenswrapper[4782]: I0130 19:46:31.411451 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:46:31 crc kubenswrapper[4782]: E0130 19:46:31.412522 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:46:43 crc kubenswrapper[4782]: I0130 19:46:43.410863 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:46:43 crc kubenswrapper[4782]: E0130 19:46:43.411834 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:46:57 crc kubenswrapper[4782]: I0130 19:46:57.410680 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:46:57 crc kubenswrapper[4782]: E0130 19:46:57.411437 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:47:12 crc kubenswrapper[4782]: I0130 19:47:12.413456 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:47:12 crc kubenswrapper[4782]: E0130 19:47:12.414865 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:47:23 crc kubenswrapper[4782]: I0130 19:47:23.411607 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:47:23 crc kubenswrapper[4782]: E0130 19:47:23.412927 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:47:34 crc kubenswrapper[4782]: I0130 19:47:34.424257 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:47:34 crc kubenswrapper[4782]: E0130 19:47:34.424993 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:47:49 crc kubenswrapper[4782]: I0130 19:47:49.411879 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:47:49 crc kubenswrapper[4782]: E0130 19:47:49.412776 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:48:01 crc kubenswrapper[4782]: I0130 19:48:01.411056 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:48:01 crc kubenswrapper[4782]: E0130 19:48:01.412398 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.425819 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tskt9"] Jan 30 19:48:10 crc kubenswrapper[4782]: E0130 19:48:10.426840 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc022036-0478-4fa0-99e2-65c00a67d8f7" containerName="collect-profiles" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.426858 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc022036-0478-4fa0-99e2-65c00a67d8f7" containerName="collect-profiles" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.427125 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc022036-0478-4fa0-99e2-65c00a67d8f7" containerName="collect-profiles" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.428873 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.436990 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tskt9"] Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.580394 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-catalog-content\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.581263 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h4jr\" (UniqueName: \"kubernetes.io/projected/110123b2-864e-4162-b68d-80941938d6fb-kube-api-access-4h4jr\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.581434 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-utilities\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.683267 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4jr\" (UniqueName: \"kubernetes.io/projected/110123b2-864e-4162-b68d-80941938d6fb-kube-api-access-4h4jr\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.683372 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-utilities\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.683434 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-catalog-content\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.683941 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-utilities\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.683972 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-catalog-content\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.706940 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4h4jr\" (UniqueName: \"kubernetes.io/projected/110123b2-864e-4162-b68d-80941938d6fb-kube-api-access-4h4jr\") pod \"certified-operators-tskt9\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:10 crc kubenswrapper[4782]: I0130 19:48:10.764416 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:11 crc kubenswrapper[4782]: I0130 19:48:11.750747 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tskt9"] Jan 30 19:48:12 crc kubenswrapper[4782]: I0130 19:48:12.475979 4782 generic.go:334] "Generic (PLEG): container finished" podID="110123b2-864e-4162-b68d-80941938d6fb" containerID="98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a" exitCode=0 Jan 30 19:48:12 crc kubenswrapper[4782]: I0130 19:48:12.476063 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tskt9" event={"ID":"110123b2-864e-4162-b68d-80941938d6fb","Type":"ContainerDied","Data":"98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a"} Jan 30 19:48:12 crc kubenswrapper[4782]: I0130 19:48:12.476266 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tskt9" event={"ID":"110123b2-864e-4162-b68d-80941938d6fb","Type":"ContainerStarted","Data":"ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48"} Jan 30 19:48:13 crc kubenswrapper[4782]: I0130 19:48:13.411295 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:48:13 crc kubenswrapper[4782]: E0130 19:48:13.411573 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:48:14 crc kubenswrapper[4782]: I0130 19:48:14.503385 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tskt9" event={"ID":"110123b2-864e-4162-b68d-80941938d6fb","Type":"ContainerStarted","Data":"a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586"} Jan 30 19:48:15 crc kubenswrapper[4782]: I0130 19:48:15.515829 4782 generic.go:334] "Generic (PLEG): container finished" podID="110123b2-864e-4162-b68d-80941938d6fb" containerID="a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586" exitCode=0 Jan 30 19:48:15 crc kubenswrapper[4782]: I0130 19:48:15.515949 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tskt9" event={"ID":"110123b2-864e-4162-b68d-80941938d6fb","Type":"ContainerDied","Data":"a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586"} Jan 30 19:48:16 crc kubenswrapper[4782]: I0130 19:48:16.529919 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tskt9" event={"ID":"110123b2-864e-4162-b68d-80941938d6fb","Type":"ContainerStarted","Data":"d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb"} Jan 30 19:48:16 crc kubenswrapper[4782]: I0130 19:48:16.562878 4782 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tskt9" podStartSLOduration=3.122232145 podStartE2EDuration="6.562853721s" podCreationTimestamp="2026-01-30 19:48:10 +0000 UTC" firstStartedPulling="2026-01-30 19:48:12.478592465 +0000 UTC m=+4668.746970500" lastFinishedPulling="2026-01-30 19:48:15.919214051 +0000 UTC m=+4672.187592076" observedRunningTime="2026-01-30 19:48:16.551394378 +0000 UTC m=+4672.819772413" watchObservedRunningTime="2026-01-30 19:48:16.562853721 +0000 UTC m=+4672.831231766" Jan 30 19:48:20 crc kubenswrapper[4782]: I0130 19:48:20.765437 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:20 crc kubenswrapper[4782]: I0130 19:48:20.765727 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:20 crc kubenswrapper[4782]: I0130 19:48:20.831525 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:21 crc kubenswrapper[4782]: I0130 19:48:21.662363 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:21 crc kubenswrapper[4782]: I0130 19:48:21.728497 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tskt9"] Jan 30 19:48:23 crc kubenswrapper[4782]: I0130 19:48:23.615402 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tskt9" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="registry-server" containerID="cri-o://d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb" gracePeriod=2 Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.179440 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.262984 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h4jr\" (UniqueName: \"kubernetes.io/projected/110123b2-864e-4162-b68d-80941938d6fb-kube-api-access-4h4jr\") pod \"110123b2-864e-4162-b68d-80941938d6fb\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.263206 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-catalog-content\") pod \"110123b2-864e-4162-b68d-80941938d6fb\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.263406 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-utilities\") pod \"110123b2-864e-4162-b68d-80941938d6fb\" (UID: \"110123b2-864e-4162-b68d-80941938d6fb\") " Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.264517 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-utilities" (OuterVolumeSpecName: "utilities") pod "110123b2-864e-4162-b68d-80941938d6fb" (UID: "110123b2-864e-4162-b68d-80941938d6fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.270949 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110123b2-864e-4162-b68d-80941938d6fb-kube-api-access-4h4jr" (OuterVolumeSpecName: "kube-api-access-4h4jr") pod "110123b2-864e-4162-b68d-80941938d6fb" (UID: "110123b2-864e-4162-b68d-80941938d6fb"). InnerVolumeSpecName "kube-api-access-4h4jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.332391 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "110123b2-864e-4162-b68d-80941938d6fb" (UID: "110123b2-864e-4162-b68d-80941938d6fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.365773 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.365819 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/110123b2-864e-4162-b68d-80941938d6fb-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.365833 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h4jr\" (UniqueName: \"kubernetes.io/projected/110123b2-864e-4162-b68d-80941938d6fb-kube-api-access-4h4jr\") on node \"crc\" DevicePath \"\"" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.429594 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:48:24 crc kubenswrapper[4782]: E0130 19:48:24.430883 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.631980 4782 generic.go:334] "Generic (PLEG): container finished" podID="110123b2-864e-4162-b68d-80941938d6fb" containerID="d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb" exitCode=0 Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.632050 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tskt9" event={"ID":"110123b2-864e-4162-b68d-80941938d6fb","Type":"ContainerDied","Data":"d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb"} Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.632072 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tskt9" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.632091 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tskt9" event={"ID":"110123b2-864e-4162-b68d-80941938d6fb","Type":"ContainerDied","Data":"ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48"} Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.632120 4782 scope.go:117] "RemoveContainer" containerID="d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.675050 4782 scope.go:117] "RemoveContainer" containerID="a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586" Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.676277 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tskt9"] Jan 30 19:48:24 crc kubenswrapper[4782]: I0130 19:48:24.689397 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tskt9"] Jan 30 19:48:25 crc kubenswrapper[4782]: I0130 19:48:25.298725 4782 scope.go:117] "RemoveContainer" containerID="98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a" Jan 30 19:48:25 crc kubenswrapper[4782]: I0130 19:48:25.441692 4782 scope.go:117] "RemoveContainer" containerID="d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb" Jan 30 19:48:25 crc kubenswrapper[4782]: E0130 19:48:25.442332 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb\": container with ID starting with d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb not found: ID does not exist" containerID="d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb" Jan 30 19:48:25 crc kubenswrapper[4782]: I0130 19:48:25.442375 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb"} err="failed to get container status \"d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb\": rpc error: code = NotFound desc = could not find container \"d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb\": container with ID starting with d187a3f07caad1e8427cf7f01ea9eaa98775bd9ebcef59eb169ff154f64e4bdb not found: ID does not exist" Jan 30 19:48:25 crc kubenswrapper[4782]: I0130 19:48:25.442403 4782 scope.go:117] "RemoveContainer" containerID="a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586" Jan 30 19:48:25 crc kubenswrapper[4782]: E0130 19:48:25.442919 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586\": container with ID starting with a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586 not found: ID does not exist" containerID="a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586" Jan 30 19:48:25 crc kubenswrapper[4782]: I0130 19:48:25.442969 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586"} err="failed to get container status \"a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586\": rpc error: code = NotFound desc = could not find 
container \"a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586\": container with ID starting with a99f6ed04c11f83821f248082233eb660e12f9cfdb0d5553f283b279cc04b586 not found: ID does not exist" Jan 30 19:48:25 crc kubenswrapper[4782]: I0130 19:48:25.442998 4782 scope.go:117] "RemoveContainer" containerID="98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a" Jan 30 19:48:25 crc kubenswrapper[4782]: E0130 19:48:25.444463 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a\": container with ID starting with 98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a not found: ID does not exist" containerID="98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a" Jan 30 19:48:25 crc kubenswrapper[4782]: I0130 19:48:25.444492 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a"} err="failed to get container status \"98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a\": rpc error: code = NotFound desc = could not find container \"98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a\": container with ID starting with 98e2ce163c4f05272aa2244fabc77d0dc20d0d35b2e06ba06b03702ba526cf2a not found: ID does not exist" Jan 30 19:48:26 crc kubenswrapper[4782]: I0130 19:48:26.421802 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110123b2-864e-4162-b68d-80941938d6fb" path="/var/lib/kubelet/pods/110123b2-864e-4162-b68d-80941938d6fb/volumes" Jan 30 19:48:29 crc kubenswrapper[4782]: E0130 19:48:29.504580 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice/crio-ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice\": RecentStats: unable to find data in memory cache]" Jan 30 19:48:35 crc kubenswrapper[4782]: I0130 19:48:35.411472 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:48:35 crc kubenswrapper[4782]: E0130 19:48:35.412686 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:48:39 crc kubenswrapper[4782]: E0130 19:48:39.775749 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice/crio-ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48\": RecentStats: unable to find data in memory cache]" Jan 30 19:48:50 crc kubenswrapper[4782]: E0130 
19:48:50.026059 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice/crio-ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice\": RecentStats: unable to find data in memory cache]" Jan 30 19:48:50 crc kubenswrapper[4782]: I0130 19:48:50.412001 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:48:50 crc kubenswrapper[4782]: E0130 19:48:50.412586 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:49:00 crc kubenswrapper[4782]: E0130 19:49:00.311540 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice/crio-ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice\": RecentStats: unable to find data in memory cache]" Jan 30 19:49:04 crc kubenswrapper[4782]: I0130 19:49:04.425547 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:49:04 crc kubenswrapper[4782]: E0130 19:49:04.426310 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:49:10 crc kubenswrapper[4782]: E0130 19:49:10.586688 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice/crio-ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice\": RecentStats: unable to find data in memory cache]" Jan 30 19:49:16 crc kubenswrapper[4782]: I0130 19:49:16.412013 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:49:16 crc kubenswrapper[4782]: E0130 19:49:16.413449 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:49:20 crc kubenswrapper[4782]: E0130 19:49:20.853189 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110123b2_864e_4162_b68d_80941938d6fb.slice/crio-ea446986118a609c8270ead57953a380ba197b01b71cf77212a9e3fc4dae2c48\": RecentStats: unable to find data in memory cache]" Jan 30 19:49:24 crc kubenswrapper[4782]: E0130 19:49:24.470516 4782 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/013075f25c36f62ff67a5ab4c2597b18020a02949c410a6e053a27a8be20416b/diff" to get inode usage: stat /var/lib/containers/storage/overlay/013075f25c36f62ff67a5ab4c2597b18020a02949c410a6e053a27a8be20416b/diff: no such file or directory, extraDiskErr: Jan 30 19:49:29 crc kubenswrapper[4782]: I0130 19:49:29.410736 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:49:29 crc kubenswrapper[4782]: E0130 19:49:29.413444 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:49:41 crc kubenswrapper[4782]: I0130 19:49:41.410997 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:49:41 crc kubenswrapper[4782]: E0130 19:49:41.411845 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:49:52 crc kubenswrapper[4782]: I0130 19:49:52.412014 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:49:52 crc kubenswrapper[4782]: E0130 19:49:52.413215 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:50:03 crc kubenswrapper[4782]: I0130 19:50:03.411494 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:50:03 crc kubenswrapper[4782]: E0130 19:50:03.412333 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:50:14 crc kubenswrapper[4782]: I0130 19:50:14.411592 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:50:14 crc kubenswrapper[4782]: E0130 19:50:14.412435 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.938597 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zb5p7"] Jan 30 19:50:22 crc kubenswrapper[4782]: E0130 19:50:22.940990 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="extract-content" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.941041 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="extract-content" Jan 30 19:50:22 crc kubenswrapper[4782]: E0130 19:50:22.941086 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="registry-server" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.941273 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="registry-server" Jan 30 19:50:22 crc kubenswrapper[4782]: E0130 19:50:22.941293 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="extract-utilities" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.941306 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="extract-utilities" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.941740 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="110123b2-864e-4162-b68d-80941938d6fb" containerName="registry-server" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.944119 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.971400 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb5p7"] Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.988719 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-utilities\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.988912 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbhcq\" (UniqueName: \"kubernetes.io/projected/695928ee-e95f-4db1-8e19-c22e0908bdb2-kube-api-access-gbhcq\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:22 crc kubenswrapper[4782]: I0130 19:50:22.989506 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-catalog-content\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.092274 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-catalog-content\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.092416 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-utilities\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.092485 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbhcq\" (UniqueName: \"kubernetes.io/projected/695928ee-e95f-4db1-8e19-c22e0908bdb2-kube-api-access-gbhcq\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.093042 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-catalog-content\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.093085 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-utilities\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.152747 4782 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-zjmxm"] Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.154947 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.162122 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjmxm"] Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.164745 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbhcq\" (UniqueName: \"kubernetes.io/projected/695928ee-e95f-4db1-8e19-c22e0908bdb2-kube-api-access-gbhcq\") pod \"redhat-marketplace-zb5p7\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.201747 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-catalog-content\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.202025 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvjd\" (UniqueName: \"kubernetes.io/projected/fbed9d97-69fa-4ed2-81a2-d106843680b1-kube-api-access-dfvjd\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.202094 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-utilities\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.286099 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.304020 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-catalog-content\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.304455 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvjd\" (UniqueName: \"kubernetes.io/projected/fbed9d97-69fa-4ed2-81a2-d106843680b1-kube-api-access-dfvjd\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.304510 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-utilities\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.304617 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-catalog-content\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.304979 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-utilities\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.332134 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvjd\" (UniqueName: \"kubernetes.io/projected/fbed9d97-69fa-4ed2-81a2-d106843680b1-kube-api-access-dfvjd\") pod \"redhat-operators-zjmxm\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.527666 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:23 crc kubenswrapper[4782]: I0130 19:50:23.809605 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb5p7"] Jan 30 19:50:24 crc kubenswrapper[4782]: I0130 19:50:24.031684 4782 generic.go:334] "Generic (PLEG): container finished" podID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerID="1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733" exitCode=0 Jan 30 19:50:24 crc kubenswrapper[4782]: I0130 19:50:24.031836 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb5p7" event={"ID":"695928ee-e95f-4db1-8e19-c22e0908bdb2","Type":"ContainerDied","Data":"1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733"} Jan 30 19:50:24 crc kubenswrapper[4782]: I0130 19:50:24.031948 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb5p7" event={"ID":"695928ee-e95f-4db1-8e19-c22e0908bdb2","Type":"ContainerStarted","Data":"6c7bf8ff521fde6f8b97a01508a9661189f6018242507746c7fa220181e40fb8"} Jan 30 19:50:24 crc kubenswrapper[4782]: I0130 19:50:24.034003 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:50:24 crc kubenswrapper[4782]: I0130 19:50:24.070046 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjmxm"] Jan 30 19:50:25 crc kubenswrapper[4782]: I0130 19:50:25.043563 4782 generic.go:334] "Generic (PLEG): container finished" podID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerID="2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268" exitCode=0 Jan 30 19:50:25 crc kubenswrapper[4782]: I0130 19:50:25.043889 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjmxm" event={"ID":"fbed9d97-69fa-4ed2-81a2-d106843680b1","Type":"ContainerDied","Data":"2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268"} Jan 30 19:50:25 crc kubenswrapper[4782]: I0130 19:50:25.043931 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjmxm" event={"ID":"fbed9d97-69fa-4ed2-81a2-d106843680b1","Type":"ContainerStarted","Data":"100119ef6aae621b7185e1b07a1be1cbe887a50b4979239459fad56ecc0f3a2f"} Jan 30 19:50:26 crc kubenswrapper[4782]: I0130 19:50:26.057493 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb5p7" event={"ID":"695928ee-e95f-4db1-8e19-c22e0908bdb2","Type":"ContainerStarted","Data":"6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a"} Jan 30 19:50:27 crc kubenswrapper[4782]: I0130 19:50:27.069219 4782 generic.go:334] "Generic (PLEG): container finished" podID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerID="6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a" exitCode=0 Jan 30 19:50:27 crc kubenswrapper[4782]: I0130 19:50:27.069279 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb5p7" event={"ID":"695928ee-e95f-4db1-8e19-c22e0908bdb2","Type":"ContainerDied","Data":"6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a"} Jan 30 19:50:27 crc kubenswrapper[4782]: I0130 19:50:27.074754 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjmxm" 
event={"ID":"fbed9d97-69fa-4ed2-81a2-d106843680b1","Type":"ContainerStarted","Data":"683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c"} Jan 30 19:50:28 crc kubenswrapper[4782]: I0130 19:50:28.087285 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb5p7" event={"ID":"695928ee-e95f-4db1-8e19-c22e0908bdb2","Type":"ContainerStarted","Data":"22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313"} Jan 30 19:50:28 crc kubenswrapper[4782]: I0130 19:50:28.110353 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zb5p7" podStartSLOduration=2.645102212 podStartE2EDuration="6.110332083s" podCreationTimestamp="2026-01-30 19:50:22 +0000 UTC" firstStartedPulling="2026-01-30 19:50:24.033765073 +0000 UTC m=+4800.302143098" lastFinishedPulling="2026-01-30 19:50:27.498994944 +0000 UTC m=+4803.767372969" observedRunningTime="2026-01-30 19:50:28.108983399 +0000 UTC m=+4804.377361434" watchObservedRunningTime="2026-01-30 19:50:28.110332083 +0000 UTC m=+4804.378710098" Jan 30 19:50:28 crc kubenswrapper[4782]: I0130 19:50:28.411121 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:50:28 crc kubenswrapper[4782]: E0130 19:50:28.411738 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:50:32 crc kubenswrapper[4782]: I0130 19:50:32.133133 4782 generic.go:334] "Generic (PLEG): container finished" podID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerID="683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c" exitCode=0 Jan 30 19:50:32 crc kubenswrapper[4782]: I0130 19:50:32.133185 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjmxm" event={"ID":"fbed9d97-69fa-4ed2-81a2-d106843680b1","Type":"ContainerDied","Data":"683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c"} Jan 30 19:50:33 crc kubenswrapper[4782]: I0130 19:50:33.146352 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjmxm" event={"ID":"fbed9d97-69fa-4ed2-81a2-d106843680b1","Type":"ContainerStarted","Data":"34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734"} Jan 30 19:50:33 crc kubenswrapper[4782]: I0130 19:50:33.171200 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjmxm" podStartSLOduration=2.696270856 podStartE2EDuration="10.171181247s" podCreationTimestamp="2026-01-30 19:50:23 +0000 UTC" firstStartedPulling="2026-01-30 19:50:25.047521872 +0000 UTC m=+4801.315899897" lastFinishedPulling="2026-01-30 19:50:32.522432253 +0000 UTC m=+4808.790810288" observedRunningTime="2026-01-30 19:50:33.163939938 +0000 UTC m=+4809.432317973" watchObservedRunningTime="2026-01-30 19:50:33.171181247 +0000 UTC m=+4809.439559272" Jan 30 19:50:33 crc kubenswrapper[4782]: I0130 19:50:33.286508 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:33 crc kubenswrapper[4782]: I0130 
19:50:33.286578 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:33 crc kubenswrapper[4782]: I0130 19:50:33.355547 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:33 crc kubenswrapper[4782]: I0130 19:50:33.529021 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:33 crc kubenswrapper[4782]: I0130 19:50:33.529346 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:50:34 crc kubenswrapper[4782]: I0130 19:50:34.205663 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:34 crc kubenswrapper[4782]: I0130 19:50:34.386892 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb5p7"] Jan 30 19:50:35 crc kubenswrapper[4782]: I0130 19:50:35.046382 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zjmxm" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="registry-server" probeResult="failure" output=< Jan 30 19:50:35 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:50:35 crc kubenswrapper[4782]: > Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.172761 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zb5p7" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="registry-server" containerID="cri-o://22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313" gracePeriod=2 Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.708520 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.829066 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-catalog-content\") pod \"695928ee-e95f-4db1-8e19-c22e0908bdb2\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.829211 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-utilities\") pod \"695928ee-e95f-4db1-8e19-c22e0908bdb2\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.829236 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbhcq\" (UniqueName: \"kubernetes.io/projected/695928ee-e95f-4db1-8e19-c22e0908bdb2-kube-api-access-gbhcq\") pod \"695928ee-e95f-4db1-8e19-c22e0908bdb2\" (UID: \"695928ee-e95f-4db1-8e19-c22e0908bdb2\") " Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.829658 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-utilities" (OuterVolumeSpecName: "utilities") pod "695928ee-e95f-4db1-8e19-c22e0908bdb2" (UID: "695928ee-e95f-4db1-8e19-c22e0908bdb2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.829945 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.835196 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/695928ee-e95f-4db1-8e19-c22e0908bdb2-kube-api-access-gbhcq" (OuterVolumeSpecName: "kube-api-access-gbhcq") pod "695928ee-e95f-4db1-8e19-c22e0908bdb2" (UID: "695928ee-e95f-4db1-8e19-c22e0908bdb2"). InnerVolumeSpecName "kube-api-access-gbhcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.867587 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "695928ee-e95f-4db1-8e19-c22e0908bdb2" (UID: "695928ee-e95f-4db1-8e19-c22e0908bdb2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.932135 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbhcq\" (UniqueName: \"kubernetes.io/projected/695928ee-e95f-4db1-8e19-c22e0908bdb2-kube-api-access-gbhcq\") on node \"crc\" DevicePath \"\"" Jan 30 19:50:36 crc kubenswrapper[4782]: I0130 19:50:36.932165 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/695928ee-e95f-4db1-8e19-c22e0908bdb2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.181805 4782 generic.go:334] "Generic (PLEG): container finished" podID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerID="22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313" exitCode=0 Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.181860 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb5p7" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.181865 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb5p7" event={"ID":"695928ee-e95f-4db1-8e19-c22e0908bdb2","Type":"ContainerDied","Data":"22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313"} Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.181990 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb5p7" event={"ID":"695928ee-e95f-4db1-8e19-c22e0908bdb2","Type":"ContainerDied","Data":"6c7bf8ff521fde6f8b97a01508a9661189f6018242507746c7fa220181e40fb8"} Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.182011 4782 scope.go:117] "RemoveContainer" containerID="22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.211785 4782 scope.go:117] "RemoveContainer" containerID="6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.219879 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb5p7"] Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.228565 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb5p7"] Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.263502 4782 scope.go:117] "RemoveContainer" containerID="1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.319156 4782 scope.go:117] "RemoveContainer" containerID="22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313" Jan 30 19:50:37 crc kubenswrapper[4782]: E0130 19:50:37.319592 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313\": container with ID starting with 22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313 not found: ID does not exist" containerID="22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.319630 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313"} err="failed to get container status \"22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313\": rpc error: code = NotFound desc = could not find container \"22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313\": container with ID starting with 22ed9137d59ff775eb655c22122603d89e2eb84ec80d8140977c1a82f4c3e313 not found: ID does not exist" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.319657 4782 scope.go:117] "RemoveContainer" containerID="6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a" Jan 30 19:50:37 crc kubenswrapper[4782]: E0130 19:50:37.319853 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a\": container with ID starting with 6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a not found: ID does not exist" containerID="6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.319877 4782 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a"} err="failed to get container status \"6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a\": rpc error: code = NotFound desc = could not find container \"6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a\": container with ID starting with 6b3cddd1f78d505c7eb6e97f0eba33a1364d61325b6e8eea31f729e3ce3e679a not found: ID does not exist" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.319902 4782 scope.go:117] "RemoveContainer" containerID="1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733" Jan 30 19:50:37 crc kubenswrapper[4782]: E0130 19:50:37.320053 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733\": container with ID starting with 1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733 not found: ID does not exist" containerID="1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733" Jan 30 19:50:37 crc kubenswrapper[4782]: I0130 19:50:37.320074 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733"} err="failed to get container status \"1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733\": rpc error: code = NotFound desc = could not find container \"1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733\": container with ID starting with 1bc6a9bd9cfc576603604a99b10671845f59170a1d45692bdb492edab8e85733 not found: ID does not exist" Jan 30 19:50:38 crc kubenswrapper[4782]: I0130 19:50:38.423664 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" path="/var/lib/kubelet/pods/695928ee-e95f-4db1-8e19-c22e0908bdb2/volumes" Jan 30 19:50:40 crc kubenswrapper[4782]: I0130 19:50:40.413036 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:50:40 crc kubenswrapper[4782]: E0130 19:50:40.413594 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:50:44 crc kubenswrapper[4782]: I0130 19:50:44.603668 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zjmxm" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="registry-server" probeResult="failure" output=< Jan 30 19:50:44 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:50:44 crc kubenswrapper[4782]: > Jan 30 19:50:52 crc kubenswrapper[4782]: I0130 19:50:52.411902 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:50:52 crc kubenswrapper[4782]: E0130 19:50:52.413055 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:50:54 crc kubenswrapper[4782]: I0130 19:50:54.617541 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zjmxm" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="registry-server" probeResult="failure" output=< Jan 30 19:50:54 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 19:50:54 crc kubenswrapper[4782]: > Jan 30 19:51:03 crc kubenswrapper[4782]: I0130 19:51:03.596232 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:51:03 crc kubenswrapper[4782]: I0130 19:51:03.653358 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:51:03 crc kubenswrapper[4782]: I0130 19:51:03.856967 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjmxm"] Jan 30 19:51:05 crc kubenswrapper[4782]: I0130 19:51:05.521737 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zjmxm" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="registry-server" containerID="cri-o://34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734" gracePeriod=2 Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.036505 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.092458 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-utilities\") pod \"fbed9d97-69fa-4ed2-81a2-d106843680b1\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.092783 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfvjd\" (UniqueName: \"kubernetes.io/projected/fbed9d97-69fa-4ed2-81a2-d106843680b1-kube-api-access-dfvjd\") pod \"fbed9d97-69fa-4ed2-81a2-d106843680b1\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.093008 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-catalog-content\") pod \"fbed9d97-69fa-4ed2-81a2-d106843680b1\" (UID: \"fbed9d97-69fa-4ed2-81a2-d106843680b1\") " Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.093517 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-utilities" (OuterVolumeSpecName: "utilities") pod "fbed9d97-69fa-4ed2-81a2-d106843680b1" (UID: "fbed9d97-69fa-4ed2-81a2-d106843680b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.093844 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.108611 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbed9d97-69fa-4ed2-81a2-d106843680b1-kube-api-access-dfvjd" (OuterVolumeSpecName: "kube-api-access-dfvjd") pod "fbed9d97-69fa-4ed2-81a2-d106843680b1" (UID: "fbed9d97-69fa-4ed2-81a2-d106843680b1"). InnerVolumeSpecName "kube-api-access-dfvjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.196504 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfvjd\" (UniqueName: \"kubernetes.io/projected/fbed9d97-69fa-4ed2-81a2-d106843680b1-kube-api-access-dfvjd\") on node \"crc\" DevicePath \"\"" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.219483 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbed9d97-69fa-4ed2-81a2-d106843680b1" (UID: "fbed9d97-69fa-4ed2-81a2-d106843680b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.298208 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbed9d97-69fa-4ed2-81a2-d106843680b1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.536144 4782 generic.go:334] "Generic (PLEG): container finished" podID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerID="34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734" exitCode=0 Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.536213 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjmxm" event={"ID":"fbed9d97-69fa-4ed2-81a2-d106843680b1","Type":"ContainerDied","Data":"34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734"} Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.536283 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjmxm" event={"ID":"fbed9d97-69fa-4ed2-81a2-d106843680b1","Type":"ContainerDied","Data":"100119ef6aae621b7185e1b07a1be1cbe887a50b4979239459fad56ecc0f3a2f"} Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.536320 4782 scope.go:117] "RemoveContainer" containerID="34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.536501 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjmxm" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.567723 4782 scope.go:117] "RemoveContainer" containerID="683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.569195 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zjmxm"] Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.581265 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zjmxm"] Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.592833 4782 scope.go:117] "RemoveContainer" containerID="2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.657134 4782 scope.go:117] "RemoveContainer" containerID="34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734" Jan 30 19:51:06 crc kubenswrapper[4782]: E0130 19:51:06.658118 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734\": container with ID starting with 34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734 not found: ID does not exist" containerID="34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.658179 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734"} err="failed to get container status \"34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734\": rpc error: code = NotFound desc = could not find container \"34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734\": container with ID starting with 34d66e483d827d96c239515d59f5abc9094b306ad3f76c1a5ce1809b6e8f0734 not found: ID does not exist" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.658411 4782 scope.go:117] "RemoveContainer" containerID="683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c" Jan 30 19:51:06 crc kubenswrapper[4782]: E0130 19:51:06.658896 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c\": container with ID starting with 683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c not found: ID does not exist" containerID="683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.658931 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c"} err="failed to get container status \"683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c\": rpc error: code = NotFound desc = could not find container \"683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c\": container with ID starting with 683fde1435f631b0b3890f8610c5cddb1c15b33f9c37e5d2922355b2b8fcc77c not found: ID does not exist" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.658952 4782 scope.go:117] "RemoveContainer" containerID="2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268" Jan 30 19:51:06 crc kubenswrapper[4782]: E0130 19:51:06.659458 4782 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268\": container with ID starting with 2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268 not found: ID does not exist" containerID="2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268" Jan 30 19:51:06 crc kubenswrapper[4782]: I0130 19:51:06.659522 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268"} err="failed to get container status \"2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268\": rpc error: code = NotFound desc = could not find container \"2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268\": container with ID starting with 2688a629b30385d25de49c42d1f4eb69e24c3544075a5acc24282f18b6def268 not found: ID does not exist" Jan 30 19:51:07 crc kubenswrapper[4782]: I0130 19:51:07.410637 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:51:07 crc kubenswrapper[4782]: E0130 19:51:07.411308 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:51:08 crc kubenswrapper[4782]: I0130 19:51:08.421366 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" path="/var/lib/kubelet/pods/fbed9d97-69fa-4ed2-81a2-d106843680b1/volumes" Jan 30 19:51:21 crc kubenswrapper[4782]: I0130 19:51:21.410326 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:51:21 crc kubenswrapper[4782]: I0130 19:51:21.698044 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"0178a5eea884b0ec0ebcf6643ae9e076589ac3a8877eb5696e68f365a644ff30"} Jan 30 19:53:49 crc kubenswrapper[4782]: I0130 19:53:49.792854 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:53:49 crc kubenswrapper[4782]: I0130 19:53:49.794482 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.754482 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvccc"] Jan 30 19:54:16 crc kubenswrapper[4782]: E0130 19:54:16.755920 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="registry-server" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 
19:54:16.755949 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="registry-server" Jan 30 19:54:16 crc kubenswrapper[4782]: E0130 19:54:16.755975 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="extract-utilities" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.755987 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="extract-utilities" Jan 30 19:54:16 crc kubenswrapper[4782]: E0130 19:54:16.756012 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="extract-content" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.756022 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="extract-content" Jan 30 19:54:16 crc kubenswrapper[4782]: E0130 19:54:16.756042 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="extract-content" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.756068 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="extract-content" Jan 30 19:54:16 crc kubenswrapper[4782]: E0130 19:54:16.756094 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="extract-utilities" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.756105 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="extract-utilities" Jan 30 19:54:16 crc kubenswrapper[4782]: E0130 19:54:16.756121 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="registry-server" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.756131 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="registry-server" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.756473 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="695928ee-e95f-4db1-8e19-c22e0908bdb2" containerName="registry-server" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.756524 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbed9d97-69fa-4ed2-81a2-d106843680b1" containerName="registry-server" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.758833 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.786090 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvccc"] Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.924417 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zphtf\" (UniqueName: \"kubernetes.io/projected/465a4358-3d29-4564-b906-a17ba4aee1d6-kube-api-access-zphtf\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.924603 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-utilities\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:16 crc kubenswrapper[4782]: I0130 19:54:16.924752 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-catalog-content\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.026435 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zphtf\" (UniqueName: \"kubernetes.io/projected/465a4358-3d29-4564-b906-a17ba4aee1d6-kube-api-access-zphtf\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.026619 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-utilities\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.026770 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-catalog-content\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.027162 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-utilities\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.027295 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-catalog-content\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.050915 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zphtf\" (UniqueName: \"kubernetes.io/projected/465a4358-3d29-4564-b906-a17ba4aee1d6-kube-api-access-zphtf\") pod \"community-operators-hvccc\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.082331 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.656420 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvccc"] Jan 30 19:54:17 crc kubenswrapper[4782]: I0130 19:54:17.826404 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccc" event={"ID":"465a4358-3d29-4564-b906-a17ba4aee1d6","Type":"ContainerStarted","Data":"ea1bed667b2352579a94adb463ce4f9b36ed093a9b77aa4c2a56d0a034bd9fb6"} Jan 30 19:54:18 crc kubenswrapper[4782]: I0130 19:54:18.837619 4782 generic.go:334] "Generic (PLEG): container finished" podID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerID="9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7" exitCode=0 Jan 30 19:54:18 crc kubenswrapper[4782]: I0130 19:54:18.837682 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccc" event={"ID":"465a4358-3d29-4564-b906-a17ba4aee1d6","Type":"ContainerDied","Data":"9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7"} Jan 30 19:54:19 crc kubenswrapper[4782]: I0130 19:54:19.793114 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:54:19 crc kubenswrapper[4782]: I0130 19:54:19.793559 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:54:19 crc kubenswrapper[4782]: I0130 19:54:19.847634 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccc" event={"ID":"465a4358-3d29-4564-b906-a17ba4aee1d6","Type":"ContainerStarted","Data":"1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76"} Jan 30 19:54:21 crc kubenswrapper[4782]: I0130 19:54:21.872443 4782 generic.go:334] "Generic (PLEG): container finished" podID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerID="1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76" exitCode=0 Jan 30 19:54:21 crc kubenswrapper[4782]: I0130 19:54:21.872557 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccc" event={"ID":"465a4358-3d29-4564-b906-a17ba4aee1d6","Type":"ContainerDied","Data":"1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76"} Jan 30 19:54:22 crc kubenswrapper[4782]: I0130 19:54:22.885741 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccc" event={"ID":"465a4358-3d29-4564-b906-a17ba4aee1d6","Type":"ContainerStarted","Data":"c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b"} Jan 30 
19:54:22 crc kubenswrapper[4782]: I0130 19:54:22.908173 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvccc" podStartSLOduration=3.451015108 podStartE2EDuration="6.90815309s" podCreationTimestamp="2026-01-30 19:54:16 +0000 UTC" firstStartedPulling="2026-01-30 19:54:18.840171201 +0000 UTC m=+5035.108549226" lastFinishedPulling="2026-01-30 19:54:22.297309183 +0000 UTC m=+5038.565687208" observedRunningTime="2026-01-30 19:54:22.907148756 +0000 UTC m=+5039.175526781" watchObservedRunningTime="2026-01-30 19:54:22.90815309 +0000 UTC m=+5039.176531115" Jan 30 19:54:27 crc kubenswrapper[4782]: I0130 19:54:27.082899 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:27 crc kubenswrapper[4782]: I0130 19:54:27.083647 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:27 crc kubenswrapper[4782]: I0130 19:54:27.168757 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:28 crc kubenswrapper[4782]: I0130 19:54:28.010327 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:28 crc kubenswrapper[4782]: I0130 19:54:28.063150 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvccc"] Jan 30 19:54:29 crc kubenswrapper[4782]: I0130 19:54:29.972499 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvccc" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="registry-server" containerID="cri-o://c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b" gracePeriod=2 Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.657550 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.740931 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-catalog-content\") pod \"465a4358-3d29-4564-b906-a17ba4aee1d6\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.740996 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-utilities\") pod \"465a4358-3d29-4564-b906-a17ba4aee1d6\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.741063 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zphtf\" (UniqueName: \"kubernetes.io/projected/465a4358-3d29-4564-b906-a17ba4aee1d6-kube-api-access-zphtf\") pod \"465a4358-3d29-4564-b906-a17ba4aee1d6\" (UID: \"465a4358-3d29-4564-b906-a17ba4aee1d6\") " Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.743433 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-utilities" (OuterVolumeSpecName: "utilities") pod "465a4358-3d29-4564-b906-a17ba4aee1d6" (UID: "465a4358-3d29-4564-b906-a17ba4aee1d6"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.763333 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465a4358-3d29-4564-b906-a17ba4aee1d6-kube-api-access-zphtf" (OuterVolumeSpecName: "kube-api-access-zphtf") pod "465a4358-3d29-4564-b906-a17ba4aee1d6" (UID: "465a4358-3d29-4564-b906-a17ba4aee1d6"). InnerVolumeSpecName "kube-api-access-zphtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.794676 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "465a4358-3d29-4564-b906-a17ba4aee1d6" (UID: "465a4358-3d29-4564-b906-a17ba4aee1d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.843418 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.843448 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/465a4358-3d29-4564-b906-a17ba4aee1d6-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.843458 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zphtf\" (UniqueName: \"kubernetes.io/projected/465a4358-3d29-4564-b906-a17ba4aee1d6-kube-api-access-zphtf\") on node \"crc\" DevicePath \"\"" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.984684 4782 generic.go:334] "Generic (PLEG): container finished" podID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerID="c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b" exitCode=0 Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.984730 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccc" event={"ID":"465a4358-3d29-4564-b906-a17ba4aee1d6","Type":"ContainerDied","Data":"c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b"} Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.984762 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvccc" event={"ID":"465a4358-3d29-4564-b906-a17ba4aee1d6","Type":"ContainerDied","Data":"ea1bed667b2352579a94adb463ce4f9b36ed093a9b77aa4c2a56d0a034bd9fb6"} Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.984781 4782 scope.go:117] "RemoveContainer" containerID="c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b" Jan 30 19:54:30 crc kubenswrapper[4782]: I0130 19:54:30.984821 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvccc" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.006956 4782 scope.go:117] "RemoveContainer" containerID="1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.036853 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvccc"] Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.053982 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvccc"] Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.054208 4782 scope.go:117] "RemoveContainer" containerID="9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.115456 4782 scope.go:117] "RemoveContainer" containerID="c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b" Jan 30 19:54:31 crc kubenswrapper[4782]: E0130 19:54:31.115964 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b\": container with ID starting with c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b not found: ID does not exist" containerID="c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.116002 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b"} err="failed to get container status \"c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b\": rpc error: code = NotFound desc = could not find container \"c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b\": container with ID starting with c42739785683c8aca33eb1e0326a04c1b9200807d28f7990e728fec248db2a3b not found: ID does not exist" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.116027 4782 scope.go:117] "RemoveContainer" containerID="1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76" Jan 30 19:54:31 crc kubenswrapper[4782]: E0130 19:54:31.116348 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76\": container with ID starting with 1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76 not found: ID does not exist" containerID="1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.116377 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76"} err="failed to get container status \"1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76\": rpc error: code = NotFound desc = could not find container \"1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76\": container with ID starting with 1acd07b105ebcb88719251102e0e9e85712cbf6e06b53e657dcc818c7bae6e76 not found: ID does not exist" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.116394 4782 scope.go:117] "RemoveContainer" containerID="9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7" Jan 30 19:54:31 crc kubenswrapper[4782]: E0130 19:54:31.116701 4782 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7\": container with ID starting with 9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7 not found: ID does not exist" containerID="9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7" Jan 30 19:54:31 crc kubenswrapper[4782]: I0130 19:54:31.116726 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7"} err="failed to get container status \"9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7\": rpc error: code = NotFound desc = could not find container \"9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7\": container with ID starting with 9c6f1f1a78ab88c8b4e087648ff1d698619032563bd7cd2d092b10fb337774f7 not found: ID does not exist" Jan 30 19:54:32 crc kubenswrapper[4782]: I0130 19:54:32.425568 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" path="/var/lib/kubelet/pods/465a4358-3d29-4564-b906-a17ba4aee1d6/volumes" Jan 30 19:54:49 crc kubenswrapper[4782]: I0130 19:54:49.792657 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:54:49 crc kubenswrapper[4782]: I0130 19:54:49.793284 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:54:49 crc kubenswrapper[4782]: I0130 19:54:49.793336 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:54:49 crc kubenswrapper[4782]: I0130 19:54:49.794106 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0178a5eea884b0ec0ebcf6643ae9e076589ac3a8877eb5696e68f365a644ff30"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:54:49 crc kubenswrapper[4782]: I0130 19:54:49.794158 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://0178a5eea884b0ec0ebcf6643ae9e076589ac3a8877eb5696e68f365a644ff30" gracePeriod=600 Jan 30 19:54:49 crc kubenswrapper[4782]: E0130 19:54:49.950418 4782 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eeb02b9_cc00_423a_87f6_2c326af45ceb.slice/crio-conmon-0178a5eea884b0ec0ebcf6643ae9e076589ac3a8877eb5696e68f365a644ff30.scope\": RecentStats: unable to find data in memory cache]" Jan 30 19:54:50 crc kubenswrapper[4782]: I0130 19:54:50.191701 4782 generic.go:334] "Generic (PLEG): container finished" 
podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="0178a5eea884b0ec0ebcf6643ae9e076589ac3a8877eb5696e68f365a644ff30" exitCode=0 Jan 30 19:54:50 crc kubenswrapper[4782]: I0130 19:54:50.192078 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"0178a5eea884b0ec0ebcf6643ae9e076589ac3a8877eb5696e68f365a644ff30"} Jan 30 19:54:50 crc kubenswrapper[4782]: I0130 19:54:50.192128 4782 scope.go:117] "RemoveContainer" containerID="1b110bfd9cea508f5fcfc955b3ea9168bf7e7418506061907d3f6f92e8fd933d" Jan 30 19:54:51 crc kubenswrapper[4782]: I0130 19:54:51.207527 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df"} Jan 30 19:56:51 crc kubenswrapper[4782]: I0130 19:56:51.467329 4782 generic.go:334] "Generic (PLEG): container finished" podID="bd740cf2-1846-4d1e-902e-6ba7a54c0019" containerID="060e6f75c5e0eaf06952daf495fdb6db426d19d4830db6d85957ec0611706f8d" exitCode=0 Jan 30 19:56:51 crc kubenswrapper[4782]: I0130 19:56:51.467400 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd740cf2-1846-4d1e-902e-6ba7a54c0019","Type":"ContainerDied","Data":"060e6f75c5e0eaf06952daf495fdb6db426d19d4830db6d85957ec0611706f8d"} Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.812528 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.948921 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.948994 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ssh-key\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.949036 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.949197 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-workdir\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.949278 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-config-data\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.949306 
4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-temporary\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.949363 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config-secret\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.949394 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndfzw\" (UniqueName: \"kubernetes.io/projected/bd740cf2-1846-4d1e-902e-6ba7a54c0019-kube-api-access-ndfzw\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.949444 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ca-certs\") pod \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\" (UID: \"bd740cf2-1846-4d1e-902e-6ba7a54c0019\") " Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.952140 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.952936 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-config-data" (OuterVolumeSpecName: "config-data") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.956546 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.958285 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.967540 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd740cf2-1846-4d1e-902e-6ba7a54c0019-kube-api-access-ndfzw" (OuterVolumeSpecName: "kube-api-access-ndfzw") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "kube-api-access-ndfzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.987737 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.990877 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:56:52 crc kubenswrapper[4782]: I0130 19:56:52.993952 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.029421 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bd740cf2-1846-4d1e-902e-6ba7a54c0019" (UID: "bd740cf2-1846-4d1e-902e-6ba7a54c0019"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051883 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051915 4782 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051929 4782 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051940 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndfzw\" (UniqueName: \"kubernetes.io/projected/bd740cf2-1846-4d1e-902e-6ba7a54c0019-kube-api-access-ndfzw\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051949 4782 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051981 4782 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051989 4782 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bd740cf2-1846-4d1e-902e-6ba7a54c0019-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.051998 4782 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bd740cf2-1846-4d1e-902e-6ba7a54c0019-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.052006 4782 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/bd740cf2-1846-4d1e-902e-6ba7a54c0019-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.075105 4782 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.153655 4782 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.495354 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"bd740cf2-1846-4d1e-902e-6ba7a54c0019","Type":"ContainerDied","Data":"5f15ce70cff01e25391c19264f218249c42956282f458b6bd43d8f861a39d5ed"} Jan 30 19:56:53 crc kubenswrapper[4782]: I0130 19:56:53.495419 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f15ce70cff01e25391c19264f218249c42956282f458b6bd43d8f861a39d5ed" Jan 30 19:56:53 crc 
kubenswrapper[4782]: I0130 19:56:53.495427 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.017072 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 19:57:01 crc kubenswrapper[4782]: E0130 19:57:01.018085 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="extract-content" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.018104 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="extract-content" Jan 30 19:57:01 crc kubenswrapper[4782]: E0130 19:57:01.018135 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="extract-utilities" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.018144 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="extract-utilities" Jan 30 19:57:01 crc kubenswrapper[4782]: E0130 19:57:01.018165 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd740cf2-1846-4d1e-902e-6ba7a54c0019" containerName="tempest-tests-tempest-tests-runner" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.018175 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd740cf2-1846-4d1e-902e-6ba7a54c0019" containerName="tempest-tests-tempest-tests-runner" Jan 30 19:57:01 crc kubenswrapper[4782]: E0130 19:57:01.018205 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="registry-server" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.018213 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="registry-server" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.018464 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="465a4358-3d29-4564-b906-a17ba4aee1d6" containerName="registry-server" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.018485 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd740cf2-1846-4d1e-902e-6ba7a54c0019" containerName="tempest-tests-tempest-tests-runner" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.019359 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.023417 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nw6qc" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.026501 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.054969 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8sgf\" (UniqueName: \"kubernetes.io/projected/8b2f36d4-09f5-46f6-9e28-f26004cc80bf-kube-api-access-z8sgf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8b2f36d4-09f5-46f6-9e28-f26004cc80bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.055192 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8b2f36d4-09f5-46f6-9e28-f26004cc80bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.157790 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sgf\" (UniqueName: \"kubernetes.io/projected/8b2f36d4-09f5-46f6-9e28-f26004cc80bf-kube-api-access-z8sgf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8b2f36d4-09f5-46f6-9e28-f26004cc80bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.157918 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8b2f36d4-09f5-46f6-9e28-f26004cc80bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.158730 4782 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8b2f36d4-09f5-46f6-9e28-f26004cc80bf\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.191611 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8sgf\" (UniqueName: \"kubernetes.io/projected/8b2f36d4-09f5-46f6-9e28-f26004cc80bf-kube-api-access-z8sgf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8b2f36d4-09f5-46f6-9e28-f26004cc80bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.192536 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8b2f36d4-09f5-46f6-9e28-f26004cc80bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc 
kubenswrapper[4782]: I0130 19:57:01.359737 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.875757 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 19:57:01 crc kubenswrapper[4782]: I0130 19:57:01.888539 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 19:57:02 crc kubenswrapper[4782]: I0130 19:57:02.616046 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8b2f36d4-09f5-46f6-9e28-f26004cc80bf","Type":"ContainerStarted","Data":"effd0b9d86a58f4df37c2eb215bb3d7350388730eb2e198c5743fac560f02b0e"} Jan 30 19:57:03 crc kubenswrapper[4782]: I0130 19:57:03.632438 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8b2f36d4-09f5-46f6-9e28-f26004cc80bf","Type":"ContainerStarted","Data":"599b95d389f8eaf47c5a43f854ceba0a7bf73de8d5beef34101e396b0dcde797"} Jan 30 19:57:03 crc kubenswrapper[4782]: I0130 19:57:03.653663 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.816196528 podStartE2EDuration="3.653203344s" podCreationTimestamp="2026-01-30 19:57:00 +0000 UTC" firstStartedPulling="2026-01-30 19:57:01.888177192 +0000 UTC m=+5198.156555227" lastFinishedPulling="2026-01-30 19:57:02.725183998 +0000 UTC m=+5198.993562043" observedRunningTime="2026-01-30 19:57:03.648783405 +0000 UTC m=+5199.917161460" watchObservedRunningTime="2026-01-30 19:57:03.653203344 +0000 UTC m=+5199.921581409" Jan 30 19:57:19 crc kubenswrapper[4782]: I0130 19:57:19.793452 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:57:19 crc kubenswrapper[4782]: I0130 19:57:19.794325 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.708852 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9b9gz/must-gather-2fb9w"] Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.711610 4782 util.go:30] "No sandbox for pod can be found. 
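The pod_startup_latency_tracker entry above for test-operator-logs-pod-tempest-tempest-tests-tempest reports two figures: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration matches that same span with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted, to within clock-reading noise. A small worked example, with the timestamps copied from that entry (the layout string is Go's default time formatting, which these fields use):

// startup_latency.go - worked example only, recomputing the durations printed
// by the pod_startup_latency_tracker entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2026-01-30 19:57:00 +0000 UTC")             // podCreationTimestamp
	firstPull := parse("2026-01-30 19:57:01.888177192 +0000 UTC") // firstStartedPulling
	lastPull := parse("2026-01-30 19:57:02.725183998 +0000 UTC")  // lastFinishedPulling
	running := parse("2026-01-30 19:57:03.653203344 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 3.653203344s, as logged
	fmt.Println("podStartSLOduration:", slo) // ~2.8161965s, matching the logged value
}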
Need to start a new one" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.717569 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9b9gz"/"default-dockercfg-gx4fv" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.717720 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9b9gz"/"kube-root-ca.crt" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.717889 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9b9gz"/"openshift-service-ca.crt" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.737095 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9b9gz/must-gather-2fb9w"] Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.812341 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec9dc11b-a597-42b6-859c-1987db75b2d0-must-gather-output\") pod \"must-gather-2fb9w\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.812656 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk45d\" (UniqueName: \"kubernetes.io/projected/ec9dc11b-a597-42b6-859c-1987db75b2d0-kube-api-access-zk45d\") pod \"must-gather-2fb9w\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.914476 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec9dc11b-a597-42b6-859c-1987db75b2d0-must-gather-output\") pod \"must-gather-2fb9w\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.914718 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk45d\" (UniqueName: \"kubernetes.io/projected/ec9dc11b-a597-42b6-859c-1987db75b2d0-kube-api-access-zk45d\") pod \"must-gather-2fb9w\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.914969 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec9dc11b-a597-42b6-859c-1987db75b2d0-must-gather-output\") pod \"must-gather-2fb9w\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:26 crc kubenswrapper[4782]: I0130 19:57:26.942388 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk45d\" (UniqueName: \"kubernetes.io/projected/ec9dc11b-a597-42b6-859c-1987db75b2d0-kube-api-access-zk45d\") pod \"must-gather-2fb9w\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:27 crc kubenswrapper[4782]: I0130 19:57:27.030111 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 19:57:27 crc kubenswrapper[4782]: I0130 19:57:27.546471 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9b9gz/must-gather-2fb9w"] Jan 30 19:57:27 crc kubenswrapper[4782]: I0130 19:57:27.922831 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" event={"ID":"ec9dc11b-a597-42b6-859c-1987db75b2d0","Type":"ContainerStarted","Data":"6cd6e5017dc658c59bbcef0d3dd58ecc27ddd9fc89df5dc126cd8041fbcc4924"} Jan 30 19:57:36 crc kubenswrapper[4782]: I0130 19:57:36.007931 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" event={"ID":"ec9dc11b-a597-42b6-859c-1987db75b2d0","Type":"ContainerStarted","Data":"c06df32bbcf2b79131bbd4d7ce77be91743c3a0d7ddd07f7c97fa357e42f7050"} Jan 30 19:57:36 crc kubenswrapper[4782]: I0130 19:57:36.008697 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" event={"ID":"ec9dc11b-a597-42b6-859c-1987db75b2d0","Type":"ContainerStarted","Data":"faa53245cf537f46902137d0951d27d0b3c87121b856bd61b49b393ff7d7d2af"} Jan 30 19:57:36 crc kubenswrapper[4782]: I0130 19:57:36.041042 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" podStartSLOduration=2.627708043 podStartE2EDuration="10.041019111s" podCreationTimestamp="2026-01-30 19:57:26 +0000 UTC" firstStartedPulling="2026-01-30 19:57:27.523131004 +0000 UTC m=+5223.791509029" lastFinishedPulling="2026-01-30 19:57:34.936442032 +0000 UTC m=+5231.204820097" observedRunningTime="2026-01-30 19:57:36.026314248 +0000 UTC m=+5232.294692283" watchObservedRunningTime="2026-01-30 19:57:36.041019111 +0000 UTC m=+5232.309397146" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.037902 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-99ksc"] Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.039922 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.098098 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msb8k\" (UniqueName: \"kubernetes.io/projected/2dcfc5ff-4954-455e-8a53-948fe3351c28-kube-api-access-msb8k\") pod \"crc-debug-99ksc\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.098381 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dcfc5ff-4954-455e-8a53-948fe3351c28-host\") pod \"crc-debug-99ksc\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.200083 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dcfc5ff-4954-455e-8a53-948fe3351c28-host\") pod \"crc-debug-99ksc\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.200187 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msb8k\" (UniqueName: \"kubernetes.io/projected/2dcfc5ff-4954-455e-8a53-948fe3351c28-kube-api-access-msb8k\") pod \"crc-debug-99ksc\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.200511 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dcfc5ff-4954-455e-8a53-948fe3351c28-host\") pod \"crc-debug-99ksc\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.222586 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msb8k\" (UniqueName: \"kubernetes.io/projected/2dcfc5ff-4954-455e-8a53-948fe3351c28-kube-api-access-msb8k\") pod \"crc-debug-99ksc\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:39 crc kubenswrapper[4782]: I0130 19:57:39.363819 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:57:40 crc kubenswrapper[4782]: I0130 19:57:40.048358 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" event={"ID":"2dcfc5ff-4954-455e-8a53-948fe3351c28","Type":"ContainerStarted","Data":"61e89d32201338bf3fd4e460356e16c6659d3aefbb3deb7341aa11a314cfc676"} Jan 30 19:57:49 crc kubenswrapper[4782]: I0130 19:57:49.792608 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:57:49 crc kubenswrapper[4782]: I0130 19:57:49.793117 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:57:50 crc kubenswrapper[4782]: I0130 19:57:50.144145 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" event={"ID":"2dcfc5ff-4954-455e-8a53-948fe3351c28","Type":"ContainerStarted","Data":"37f8730c3e188a73b722aa4cd8a36e84dbf51dfabe0668285ce22b6ad9a486a4"} Jan 30 19:57:50 crc kubenswrapper[4782]: I0130 19:57:50.167592 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" podStartSLOduration=1.057533733 podStartE2EDuration="11.167576642s" podCreationTimestamp="2026-01-30 19:57:39 +0000 UTC" firstStartedPulling="2026-01-30 19:57:39.436415517 +0000 UTC m=+5235.704793542" lastFinishedPulling="2026-01-30 19:57:49.546458426 +0000 UTC m=+5245.814836451" observedRunningTime="2026-01-30 19:57:50.164542497 +0000 UTC m=+5246.432920522" watchObservedRunningTime="2026-01-30 19:57:50.167576642 +0000 UTC m=+5246.435954667" Jan 30 19:58:19 crc kubenswrapper[4782]: I0130 19:58:19.792346 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 19:58:19 crc kubenswrapper[4782]: I0130 19:58:19.792855 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 19:58:19 crc kubenswrapper[4782]: I0130 19:58:19.792907 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 19:58:19 crc kubenswrapper[4782]: I0130 19:58:19.793796 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 19:58:19 crc kubenswrapper[4782]: I0130 19:58:19.793851 4782 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" gracePeriod=600 Jan 30 19:58:19 crc kubenswrapper[4782]: E0130 19:58:19.927442 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:58:20 crc kubenswrapper[4782]: I0130 19:58:20.432608 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" exitCode=0 Jan 30 19:58:20 crc kubenswrapper[4782]: I0130 19:58:20.432654 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df"} Jan 30 19:58:20 crc kubenswrapper[4782]: I0130 19:58:20.432684 4782 scope.go:117] "RemoveContainer" containerID="0178a5eea884b0ec0ebcf6643ae9e076589ac3a8877eb5696e68f365a644ff30" Jan 30 19:58:20 crc kubenswrapper[4782]: I0130 19:58:20.433607 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:58:20 crc kubenswrapper[4782]: E0130 19:58:20.434029 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:58:32 crc kubenswrapper[4782]: I0130 19:58:32.410761 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:58:32 crc kubenswrapper[4782]: E0130 19:58:32.411601 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:58:38 crc kubenswrapper[4782]: I0130 19:58:38.640795 4782 generic.go:334] "Generic (PLEG): container finished" podID="2dcfc5ff-4954-455e-8a53-948fe3351c28" containerID="37f8730c3e188a73b722aa4cd8a36e84dbf51dfabe0668285ce22b6ad9a486a4" exitCode=0 Jan 30 19:58:38 crc kubenswrapper[4782]: I0130 19:58:38.640889 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" event={"ID":"2dcfc5ff-4954-455e-8a53-948fe3351c28","Type":"ContainerDied","Data":"37f8730c3e188a73b722aa4cd8a36e84dbf51dfabe0668285ce22b6ad9a486a4"} Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 
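The liveness failures above are plain HTTP GETs against http://127.0.0.1:8798/health that are refused while the daemon is down; once the failure threshold is crossed the kubelet logs "Killing container with a grace period" (gracePeriod=600 here) and the restart immediately runs into the CrashLoopBackOff error that follows. A minimal stand-in for the probe itself, not the kubelet's prober (the 1-second timeout mirrors the usual probe default and is an assumption, since the pod spec is not shown in this journal):

// probe_check.go - minimal stand-in for the failing liveness probe above:
// an HTTP GET against 127.0.0.1:8798/health where any transport error
// (e.g. "connect: connection refused") or a status outside 200-399 is a failure.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second} // assumed probe timeout
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		fmt.Println("probe failure:", err)
		return
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		fmt.Println("probe success:", resp.Status)
	} else {
		fmt.Println("probe failure: HTTP", resp.Status)
	}
}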
19:58:39.805289 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 19:58:39.853011 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-99ksc"] Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 19:58:39.864079 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-99ksc"] Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 19:58:39.944994 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msb8k\" (UniqueName: \"kubernetes.io/projected/2dcfc5ff-4954-455e-8a53-948fe3351c28-kube-api-access-msb8k\") pod \"2dcfc5ff-4954-455e-8a53-948fe3351c28\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 19:58:39.945295 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dcfc5ff-4954-455e-8a53-948fe3351c28-host\") pod \"2dcfc5ff-4954-455e-8a53-948fe3351c28\" (UID: \"2dcfc5ff-4954-455e-8a53-948fe3351c28\") " Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 19:58:39.945406 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2dcfc5ff-4954-455e-8a53-948fe3351c28-host" (OuterVolumeSpecName: "host") pod "2dcfc5ff-4954-455e-8a53-948fe3351c28" (UID: "2dcfc5ff-4954-455e-8a53-948fe3351c28"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 19:58:39.945934 4782 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dcfc5ff-4954-455e-8a53-948fe3351c28-host\") on node \"crc\" DevicePath \"\"" Jan 30 19:58:39 crc kubenswrapper[4782]: I0130 19:58:39.953835 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dcfc5ff-4954-455e-8a53-948fe3351c28-kube-api-access-msb8k" (OuterVolumeSpecName: "kube-api-access-msb8k") pod "2dcfc5ff-4954-455e-8a53-948fe3351c28" (UID: "2dcfc5ff-4954-455e-8a53-948fe3351c28"). InnerVolumeSpecName "kube-api-access-msb8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:58:40 crc kubenswrapper[4782]: I0130 19:58:40.050358 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msb8k\" (UniqueName: \"kubernetes.io/projected/2dcfc5ff-4954-455e-8a53-948fe3351c28-kube-api-access-msb8k\") on node \"crc\" DevicePath \"\"" Jan 30 19:58:40 crc kubenswrapper[4782]: I0130 19:58:40.432508 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dcfc5ff-4954-455e-8a53-948fe3351c28" path="/var/lib/kubelet/pods/2dcfc5ff-4954-455e-8a53-948fe3351c28/volumes" Jan 30 19:58:40 crc kubenswrapper[4782]: I0130 19:58:40.664965 4782 scope.go:117] "RemoveContainer" containerID="37f8730c3e188a73b722aa4cd8a36e84dbf51dfabe0668285ce22b6ad9a486a4" Jan 30 19:58:40 crc kubenswrapper[4782]: I0130 19:58:40.665365 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-99ksc" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.045216 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-xh5d8"] Jan 30 19:58:41 crc kubenswrapper[4782]: E0130 19:58:41.046443 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dcfc5ff-4954-455e-8a53-948fe3351c28" containerName="container-00" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.046494 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcfc5ff-4954-455e-8a53-948fe3351c28" containerName="container-00" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.046861 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dcfc5ff-4954-455e-8a53-948fe3351c28" containerName="container-00" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.048100 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.174631 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wlv2\" (UniqueName: \"kubernetes.io/projected/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-kube-api-access-9wlv2\") pod \"crc-debug-xh5d8\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.174814 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-host\") pod \"crc-debug-xh5d8\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.276582 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wlv2\" (UniqueName: \"kubernetes.io/projected/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-kube-api-access-9wlv2\") pod \"crc-debug-xh5d8\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.276645 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-host\") pod \"crc-debug-xh5d8\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.276861 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-host\") pod \"crc-debug-xh5d8\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.306896 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wlv2\" (UniqueName: \"kubernetes.io/projected/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-kube-api-access-9wlv2\") pod \"crc-debug-xh5d8\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.365450 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:41 crc kubenswrapper[4782]: W0130 19:58:41.397593 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fb76a8a_45f5_4f95_9d39_85c5cf91bb54.slice/crio-8611a2319ac09d1bd3c15506b143cf13c6cdfa6c3f39e9316e4e47b876a300d5 WatchSource:0}: Error finding container 8611a2319ac09d1bd3c15506b143cf13c6cdfa6c3f39e9316e4e47b876a300d5: Status 404 returned error can't find the container with id 8611a2319ac09d1bd3c15506b143cf13c6cdfa6c3f39e9316e4e47b876a300d5 Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.677059 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" event={"ID":"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54","Type":"ContainerStarted","Data":"e9a8bc49f7c05a289bc5b6ddf0d772b6fca309fe9abd822d73b1f165c2394491"} Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.677126 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" event={"ID":"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54","Type":"ContainerStarted","Data":"8611a2319ac09d1bd3c15506b143cf13c6cdfa6c3f39e9316e4e47b876a300d5"} Jan 30 19:58:41 crc kubenswrapper[4782]: I0130 19:58:41.704295 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" podStartSLOduration=0.704277201 podStartE2EDuration="704.277201ms" podCreationTimestamp="2026-01-30 19:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 19:58:41.693946716 +0000 UTC m=+5297.962324731" watchObservedRunningTime="2026-01-30 19:58:41.704277201 +0000 UTC m=+5297.972655226" Jan 30 19:58:42 crc kubenswrapper[4782]: I0130 19:58:42.685377 4782 generic.go:334] "Generic (PLEG): container finished" podID="6fb76a8a-45f5-4f95-9d39-85c5cf91bb54" containerID="e9a8bc49f7c05a289bc5b6ddf0d772b6fca309fe9abd822d73b1f165c2394491" exitCode=0 Jan 30 19:58:42 crc kubenswrapper[4782]: I0130 19:58:42.685454 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" event={"ID":"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54","Type":"ContainerDied","Data":"e9a8bc49f7c05a289bc5b6ddf0d772b6fca309fe9abd822d73b1f165c2394491"} Jan 30 19:58:43 crc kubenswrapper[4782]: I0130 19:58:43.808681 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:43 crc kubenswrapper[4782]: I0130 19:58:43.926323 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wlv2\" (UniqueName: \"kubernetes.io/projected/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-kube-api-access-9wlv2\") pod \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " Jan 30 19:58:43 crc kubenswrapper[4782]: I0130 19:58:43.926675 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-host\") pod \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\" (UID: \"6fb76a8a-45f5-4f95-9d39-85c5cf91bb54\") " Jan 30 19:58:43 crc kubenswrapper[4782]: I0130 19:58:43.926746 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-host" (OuterVolumeSpecName: "host") pod "6fb76a8a-45f5-4f95-9d39-85c5cf91bb54" (UID: "6fb76a8a-45f5-4f95-9d39-85c5cf91bb54"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 19:58:43 crc kubenswrapper[4782]: I0130 19:58:43.927296 4782 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-host\") on node \"crc\" DevicePath \"\"" Jan 30 19:58:43 crc kubenswrapper[4782]: I0130 19:58:43.947265 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-kube-api-access-9wlv2" (OuterVolumeSpecName: "kube-api-access-9wlv2") pod "6fb76a8a-45f5-4f95-9d39-85c5cf91bb54" (UID: "6fb76a8a-45f5-4f95-9d39-85c5cf91bb54"). InnerVolumeSpecName "kube-api-access-9wlv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:58:44 crc kubenswrapper[4782]: I0130 19:58:44.028812 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wlv2\" (UniqueName: \"kubernetes.io/projected/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54-kube-api-access-9wlv2\") on node \"crc\" DevicePath \"\"" Jan 30 19:58:44 crc kubenswrapper[4782]: I0130 19:58:44.247489 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-xh5d8"] Jan 30 19:58:44 crc kubenswrapper[4782]: I0130 19:58:44.257672 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-xh5d8"] Jan 30 19:58:44 crc kubenswrapper[4782]: I0130 19:58:44.424467 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fb76a8a-45f5-4f95-9d39-85c5cf91bb54" path="/var/lib/kubelet/pods/6fb76a8a-45f5-4f95-9d39-85c5cf91bb54/volumes" Jan 30 19:58:44 crc kubenswrapper[4782]: I0130 19:58:44.720688 4782 scope.go:117] "RemoveContainer" containerID="e9a8bc49f7c05a289bc5b6ddf0d772b6fca309fe9abd822d73b1f165c2394491" Jan 30 19:58:44 crc kubenswrapper[4782]: I0130 19:58:44.720715 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-xh5d8" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.400296 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-9hj99"] Jan 30 19:58:45 crc kubenswrapper[4782]: E0130 19:58:45.400697 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fb76a8a-45f5-4f95-9d39-85c5cf91bb54" containerName="container-00" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.400709 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fb76a8a-45f5-4f95-9d39-85c5cf91bb54" containerName="container-00" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.400886 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fb76a8a-45f5-4f95-9d39-85c5cf91bb54" containerName="container-00" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.401513 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.563415 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1db080d8-6852-4035-bf9d-c3a99b30730e-host\") pod \"crc-debug-9hj99\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.563515 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrktv\" (UniqueName: \"kubernetes.io/projected/1db080d8-6852-4035-bf9d-c3a99b30730e-kube-api-access-qrktv\") pod \"crc-debug-9hj99\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.665048 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1db080d8-6852-4035-bf9d-c3a99b30730e-host\") pod \"crc-debug-9hj99\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.665431 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrktv\" (UniqueName: \"kubernetes.io/projected/1db080d8-6852-4035-bf9d-c3a99b30730e-kube-api-access-qrktv\") pod \"crc-debug-9hj99\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.665199 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1db080d8-6852-4035-bf9d-c3a99b30730e-host\") pod \"crc-debug-9hj99\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.686043 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrktv\" (UniqueName: \"kubernetes.io/projected/1db080d8-6852-4035-bf9d-c3a99b30730e-kube-api-access-qrktv\") pod \"crc-debug-9hj99\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: I0130 19:58:45.717811 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:45 crc kubenswrapper[4782]: W0130 19:58:45.747982 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db080d8_6852_4035_bf9d_c3a99b30730e.slice/crio-fb505d2c7f3e67c5dd55a6c21d8d480d9382a9a0ce0362bd1b63dfef68e9618f WatchSource:0}: Error finding container fb505d2c7f3e67c5dd55a6c21d8d480d9382a9a0ce0362bd1b63dfef68e9618f: Status 404 returned error can't find the container with id fb505d2c7f3e67c5dd55a6c21d8d480d9382a9a0ce0362bd1b63dfef68e9618f Jan 30 19:58:46 crc kubenswrapper[4782]: I0130 19:58:46.411060 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:58:46 crc kubenswrapper[4782]: E0130 19:58:46.411585 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:58:46 crc kubenswrapper[4782]: I0130 19:58:46.741987 4782 generic.go:334] "Generic (PLEG): container finished" podID="1db080d8-6852-4035-bf9d-c3a99b30730e" containerID="6e028d8dfe8cd7f10a9c5602af372cc32bf1be86b624572f3f5277d485f3d30a" exitCode=0 Jan 30 19:58:46 crc kubenswrapper[4782]: I0130 19:58:46.742037 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-9hj99" event={"ID":"1db080d8-6852-4035-bf9d-c3a99b30730e","Type":"ContainerDied","Data":"6e028d8dfe8cd7f10a9c5602af372cc32bf1be86b624572f3f5277d485f3d30a"} Jan 30 19:58:46 crc kubenswrapper[4782]: I0130 19:58:46.742699 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/crc-debug-9hj99" event={"ID":"1db080d8-6852-4035-bf9d-c3a99b30730e","Type":"ContainerStarted","Data":"fb505d2c7f3e67c5dd55a6c21d8d480d9382a9a0ce0362bd1b63dfef68e9618f"} Jan 30 19:58:46 crc kubenswrapper[4782]: I0130 19:58:46.805380 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-9hj99"] Jan 30 19:58:46 crc kubenswrapper[4782]: I0130 19:58:46.821486 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9b9gz/crc-debug-9hj99"] Jan 30 19:58:47 crc kubenswrapper[4782]: I0130 19:58:47.869763 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:47 crc kubenswrapper[4782]: I0130 19:58:47.926248 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1db080d8-6852-4035-bf9d-c3a99b30730e-host\") pod \"1db080d8-6852-4035-bf9d-c3a99b30730e\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " Jan 30 19:58:47 crc kubenswrapper[4782]: I0130 19:58:47.926397 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1db080d8-6852-4035-bf9d-c3a99b30730e-host" (OuterVolumeSpecName: "host") pod "1db080d8-6852-4035-bf9d-c3a99b30730e" (UID: "1db080d8-6852-4035-bf9d-c3a99b30730e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 19:58:47 crc kubenswrapper[4782]: I0130 19:58:47.926554 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrktv\" (UniqueName: \"kubernetes.io/projected/1db080d8-6852-4035-bf9d-c3a99b30730e-kube-api-access-qrktv\") pod \"1db080d8-6852-4035-bf9d-c3a99b30730e\" (UID: \"1db080d8-6852-4035-bf9d-c3a99b30730e\") " Jan 30 19:58:47 crc kubenswrapper[4782]: I0130 19:58:47.927350 4782 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1db080d8-6852-4035-bf9d-c3a99b30730e-host\") on node \"crc\" DevicePath \"\"" Jan 30 19:58:47 crc kubenswrapper[4782]: I0130 19:58:47.935095 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db080d8-6852-4035-bf9d-c3a99b30730e-kube-api-access-qrktv" (OuterVolumeSpecName: "kube-api-access-qrktv") pod "1db080d8-6852-4035-bf9d-c3a99b30730e" (UID: "1db080d8-6852-4035-bf9d-c3a99b30730e"). InnerVolumeSpecName "kube-api-access-qrktv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 19:58:48 crc kubenswrapper[4782]: I0130 19:58:48.029177 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrktv\" (UniqueName: \"kubernetes.io/projected/1db080d8-6852-4035-bf9d-c3a99b30730e-kube-api-access-qrktv\") on node \"crc\" DevicePath \"\"" Jan 30 19:58:48 crc kubenswrapper[4782]: I0130 19:58:48.425589 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db080d8-6852-4035-bf9d-c3a99b30730e" path="/var/lib/kubelet/pods/1db080d8-6852-4035-bf9d-c3a99b30730e/volumes" Jan 30 19:58:48 crc kubenswrapper[4782]: I0130 19:58:48.764010 4782 scope.go:117] "RemoveContainer" containerID="6e028d8dfe8cd7f10a9c5602af372cc32bf1be86b624572f3f5277d485f3d30a" Jan 30 19:58:48 crc kubenswrapper[4782]: I0130 19:58:48.764514 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/crc-debug-9hj99" Jan 30 19:58:57 crc kubenswrapper[4782]: I0130 19:58:57.411547 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:58:57 crc kubenswrapper[4782]: E0130 19:58:57.412349 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:59:11 crc kubenswrapper[4782]: I0130 19:59:11.411236 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:59:11 crc kubenswrapper[4782]: E0130 19:59:11.412045 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:59:16 crc kubenswrapper[4782]: I0130 19:59:16.275674 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-564c766f5d-2hhs6_5162dd27-124a-4e1c-8a8c-51c4e47fce04/barbican-api/0.log" Jan 30 19:59:16 crc kubenswrapper[4782]: I0130 19:59:16.413577 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-564c766f5d-2hhs6_5162dd27-124a-4e1c-8a8c-51c4e47fce04/barbican-api-log/0.log" Jan 30 19:59:16 crc kubenswrapper[4782]: I0130 19:59:16.481200 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d6457597d-9bs7l_0575e76f-c529-41f7-8b65-87ec77ec9614/barbican-keystone-listener/0.log" Jan 30 19:59:16 crc kubenswrapper[4782]: I0130 19:59:16.605597 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d6457597d-9bs7l_0575e76f-c529-41f7-8b65-87ec77ec9614/barbican-keystone-listener-log/0.log" Jan 30 19:59:16 crc kubenswrapper[4782]: I0130 19:59:16.727484 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d84b8b585-bfbrv_69354d0f-b465-419f-8fd1-b812a39312c5/barbican-worker/0.log" Jan 30 19:59:16 crc kubenswrapper[4782]: I0130 19:59:16.801558 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d84b8b585-bfbrv_69354d0f-b465-419f-8fd1-b812a39312c5/barbican-worker-log/0.log" Jan 30 19:59:16 crc kubenswrapper[4782]: I0130 19:59:16.878119 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj_f9e549bf-994f-46e6-9d42-72a655229b73/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.079402 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/proxy-httpd/0.log" Jan 30 19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.133053 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/ceilometer-central-agent/0.log" Jan 30 
19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.151635 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/ceilometer-notification-agent/0.log" Jan 30 19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.205404 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/sg-core/0.log" Jan 30 19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.386709 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3d752ed-dc31-49b8-80ce-b3b94f07dcf3/cinder-api-log/0.log" Jan 30 19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.676710 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2773f02f-26c1-4c26-a789-afc299bd11c1/probe/0.log" Jan 30 19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.984986 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5ac7726e-05ca-4e51-99e2-cce317290a59/cinder-scheduler/0.log" Jan 30 19:59:17 crc kubenswrapper[4782]: I0130 19:59:17.993288 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2773f02f-26c1-4c26-a789-afc299bd11c1/cinder-backup/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.002804 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3d752ed-dc31-49b8-80ce-b3b94f07dcf3/cinder-api/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.092933 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5ac7726e-05ca-4e51-99e2-cce317290a59/probe/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.291785 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_eacda6b1-72d6-4a27-9aa5-c0b01309e9d9/probe/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.391649 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_eacda6b1-72d6-4a27-9aa5-c0b01309e9d9/cinder-volume/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.549404 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f0fe280a-4eaa-4dc5-8898-053826fd7131/probe/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.646052 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f0fe280a-4eaa-4dc5-8898-053826fd7131/cinder-volume/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.682491 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg_c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.838069 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk_f9396100-4e8e-4e30-af8c-82043b59d08d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:18 crc kubenswrapper[4782]: I0130 19:59:18.928679 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6dcf879fb5-dx4z8_a82aaec0-46a1-4f29-9c09-d4920bd1b315/init/0.log" Jan 30 19:59:19 crc kubenswrapper[4782]: I0130 19:59:19.052662 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6dcf879fb5-dx4z8_a82aaec0-46a1-4f29-9c09-d4920bd1b315/init/0.log" Jan 30 19:59:19 crc 
kubenswrapper[4782]: I0130 19:59:19.137256 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-58pjf_92905892-4424-4957-a945-eb130f92d03f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:19 crc kubenswrapper[4782]: I0130 19:59:19.235505 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6dcf879fb5-dx4z8_a82aaec0-46a1-4f29-9c09-d4920bd1b315/dnsmasq-dns/0.log" Jan 30 19:59:19 crc kubenswrapper[4782]: I0130 19:59:19.376688 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397/glance-httpd/0.log" Jan 30 19:59:19 crc kubenswrapper[4782]: I0130 19:59:19.418737 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397/glance-log/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.064423 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_974e57d1-5346-4863-a1e3-1b595eaa91b5/glance-log/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.117742 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_974e57d1-5346-4863-a1e3-1b595eaa91b5/glance-httpd/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.326777 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d67b5c94d-pwj69_53e414f7-9297-46fb-87b6-19ce7ee55758/horizon/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.403154 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zkx58_a4f9b344-67a5-4f16-99b1-d8402f3e44cb/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.542069 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-z68rj_5b8169ab-3daf-43a7-a107-075317085df1/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.875760 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29496661-m5ftp_dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2/keystone-cron/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.955297 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2faa1c8b-e69c-4b72-bc58-0d1a5e032d52/kube-state-metrics/0.log" Jan 30 19:59:20 crc kubenswrapper[4782]: I0130 19:59:20.965617 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d67b5c94d-pwj69_53e414f7-9297-46fb-87b6-19ce7ee55758/horizon-log/0.log" Jan 30 19:59:21 crc kubenswrapper[4782]: I0130 19:59:21.147135 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5c6f877b5f-8gdbg_199910fe-a283-4898-bd2b-69b6e1b7266b/keystone-api/0.log" Jan 30 19:59:21 crc kubenswrapper[4782]: I0130 19:59:21.220768 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-8cccm_31a7790e-b097-45c3-9088-5fc885e63ef8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:22 crc kubenswrapper[4782]: I0130 19:59:22.128736 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc_39dc3714-072a-4267-812c-49c2aa1efe2d/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:22 crc kubenswrapper[4782]: I0130 19:59:22.191189 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695d477669-wlmct_ed784e83-4524-4c3f-8697-ea3821f297b1/neutron-httpd/0.log" Jan 30 19:59:22 crc kubenswrapper[4782]: I0130 19:59:22.204621 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695d477669-wlmct_ed784e83-4524-4c3f-8697-ea3821f297b1/neutron-api/0.log" Jan 30 19:59:22 crc kubenswrapper[4782]: I0130 19:59:22.729399 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_860e5849-ad0b-4f89-87db-b839441f0dd9/nova-cell0-conductor-conductor/0.log" Jan 30 19:59:23 crc kubenswrapper[4782]: I0130 19:59:23.027379 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_71afc4ce-765f-4c71-a76e-6a4eff2b553d/nova-cell1-conductor-conductor/0.log" Jan 30 19:59:23 crc kubenswrapper[4782]: I0130 19:59:23.276343 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_858bbcbd-4a47-42ee-a581-2b03ca45dcaa/nova-cell1-novncproxy-novncproxy/0.log" Jan 30 19:59:23 crc kubenswrapper[4782]: I0130 19:59:23.540369 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-f8ss8_6e19180a-524d-4e70-8e9a-e72c69f07d7c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:23 crc kubenswrapper[4782]: I0130 19:59:23.715569 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_20743691-4aeb-4b01-a442-5df58c830c02/nova-api-log/0.log" Jan 30 19:59:23 crc kubenswrapper[4782]: I0130 19:59:23.813923 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_efd2ef42-aeac-48dd-9e95-fd000381dbfa/nova-metadata-log/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.057791 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_20743691-4aeb-4b01-a442-5df58c830c02/nova-api-api/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.269656 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a458f19f-501f-4703-9cfe-d8638418215b/mysql-bootstrap/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.272313 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1caadc78-c45b-4e64-ae44-a6f96bb41126/nova-scheduler-scheduler/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.425841 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a458f19f-501f-4703-9cfe-d8638418215b/mysql-bootstrap/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.488112 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a458f19f-501f-4703-9cfe-d8638418215b/galera/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.620037 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_070c9056-8c32-47ae-b937-b3e4b2b464e7/mysql-bootstrap/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.814271 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_070c9056-8c32-47ae-b937-b3e4b2b464e7/mysql-bootstrap/0.log" Jan 30 19:59:24 
crc kubenswrapper[4782]: I0130 19:59:24.859102 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_070c9056-8c32-47ae-b937-b3e4b2b464e7/galera/0.log" Jan 30 19:59:24 crc kubenswrapper[4782]: I0130 19:59:24.992337 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8322d742-28bf-4eb4-ba33-8e37da0780f1/openstackclient/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.099625 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hcgcc_ed7f80e9-b13c-461c-b115-55b8ce9662dc/openstack-network-exporter/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.297111 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovsdb-server-init/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.410801 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:59:25 crc kubenswrapper[4782]: E0130 19:59:25.411126 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.443771 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovsdb-server-init/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.533782 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovsdb-server/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.718278 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_efd2ef42-aeac-48dd-9e95-fd000381dbfa/nova-metadata-metadata/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.734804 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-pz2pk_91d457a1-1878-47f1-a1d3-eac450864978/ovn-controller/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.950330 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovs-vswitchd/0.log" Jan 30 19:59:25 crc kubenswrapper[4782]: I0130 19:59:25.951587 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-qncfz_5e8ffe68-337e-40ee-a941-188e1bad9112/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:26 crc kubenswrapper[4782]: I0130 19:59:26.291598 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_547e7f64-963a-48bd-afa5-e908a3a716a2/ovn-northd/0.log" Jan 30 19:59:26 crc kubenswrapper[4782]: I0130 19:59:26.298969 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_547e7f64-963a-48bd-afa5-e908a3a716a2/openstack-network-exporter/0.log" Jan 30 19:59:26 crc kubenswrapper[4782]: I0130 19:59:26.397692 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a5e12229-4958-47a9-9210-18fba05c1319/openstack-network-exporter/0.log" 
Jan 30 19:59:26 crc kubenswrapper[4782]: I0130 19:59:26.486137 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a5e12229-4958-47a9-9210-18fba05c1319/ovsdbserver-nb/0.log" Jan 30 19:59:26 crc kubenswrapper[4782]: I0130 19:59:26.921949 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6123a5a8-5a6d-455c-9418-71d31b35e2f3/openstack-network-exporter/0.log" Jan 30 19:59:26 crc kubenswrapper[4782]: I0130 19:59:26.968682 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6123a5a8-5a6d-455c-9418-71d31b35e2f3/ovsdbserver-sb/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.215062 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/init-config-reloader/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.362455 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6889757c94-v7jr9_da3c4b41-c384-4983-a704-e63d44f1fed9/placement-api/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.404218 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/config-reloader/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.410452 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6889757c94-v7jr9_da3c4b41-c384-4983-a704-e63d44f1fed9/placement-log/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.465412 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/init-config-reloader/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.515159 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/prometheus/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.596360 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/thanos-sidecar/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.713531 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_75a1a86d-bec9-47a8-9031-21a30029c09d/setup-container/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.896611 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_75a1a86d-bec9-47a8-9031-21a30029c09d/setup-container/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.906940 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9e3b2844-afde-444d-b7ee-cddd8b543bf6/setup-container/0.log" Jan 30 19:59:27 crc kubenswrapper[4782]: I0130 19:59:27.907710 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_75a1a86d-bec9-47a8-9031-21a30029c09d/rabbitmq/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.089116 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9e3b2844-afde-444d-b7ee-cddd8b543bf6/setup-container/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.125806 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9e3b2844-afde-444d-b7ee-cddd8b543bf6/rabbitmq/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.174515 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f74ddec0-3f55-44e4-80f4-2d4eac7a9093/setup-container/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.344861 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f74ddec0-3f55-44e4-80f4-2d4eac7a9093/setup-container/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.375479 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f74ddec0-3f55-44e4-80f4-2d4eac7a9093/rabbitmq/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.441251 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh_385f85fe-f3e6-4149-9241-ae72c3e9d52d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.540562 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7c3e9bb9-ed43-4499-88c1-2bde956a84b8/memcached/0.log" Jan 30 19:59:28 crc kubenswrapper[4782]: I0130 19:59:28.587842 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-42r9p_f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.276804 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79_77e26ddb-4b47-4b06-a390-76653b75c503/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.293545 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7st4m_f287af23-a5f5-4aa9-b9c2-9cd87fc26da3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.365626 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mvw5c_0cceb6ea-381a-4862-bbff-42f7ce3cbaf4/ssh-known-hosts-edpm-deployment/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.585023 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6989f95847-z8k6r_d86a7921-fdce-4a73-ad98-4dc1373c72e2/proxy-server/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.669899 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6989f95847-z8k6r_d86a7921-fdce-4a73-ad98-4dc1373c72e2/proxy-httpd/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.679290 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mbdqv_8a9f5c0e-8d43-437d-b47e-e72f03df077b/swift-ring-rebalance/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.799350 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-auditor/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.891792 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-reaper/0.log" Jan 30 19:59:29 crc kubenswrapper[4782]: I0130 19:59:29.919371 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-replicator/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.001291 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-server/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.026578 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-auditor/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.108940 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-replicator/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.143219 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-server/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.150070 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-updater/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.199660 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-auditor/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.263037 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-expirer/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.291259 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-replicator/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.335259 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-server/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.352791 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-updater/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.413656 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/rsync/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.442413 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/swift-recon-cron/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.564867 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt_0055025f-d7c7-4469-9791-ffcb0bbdfef4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.817122 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bd740cf2-1846-4d1e-902e-6ba7a54c0019/tempest-tests-tempest-tests-runner/0.log" Jan 30 19:59:30 crc kubenswrapper[4782]: I0130 19:59:30.869669 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8b2f36d4-09f5-46f6-9e28-f26004cc80bf/test-operator-logs-container/0.log" Jan 30 19:59:31 crc kubenswrapper[4782]: I0130 19:59:31.286912 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-klkrq_df904ca8-14f8-4f01-b67e-be59a86d4981/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 19:59:31 crc kubenswrapper[4782]: I0130 19:59:31.982320 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_1e15b60d-e1ab-4144-a82d-021b51750157/watcher-applier/0.log" Jan 30 19:59:32 crc kubenswrapper[4782]: I0130 19:59:32.486201 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_68609276-cd5e-43d1-bef5-c79ef0628d5b/watcher-api-log/0.log" Jan 30 19:59:34 crc kubenswrapper[4782]: I0130 19:59:34.567940 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_25e52062-f76c-4ebf-9738-8e5a9990aba9/watcher-decision-engine/0.log" Jan 30 19:59:35 crc kubenswrapper[4782]: I0130 19:59:35.267843 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_68609276-cd5e-43d1-bef5-c79ef0628d5b/watcher-api/0.log" Jan 30 19:59:39 crc kubenswrapper[4782]: I0130 19:59:39.411964 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:59:39 crc kubenswrapper[4782]: E0130 19:59:39.412939 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 19:59:54 crc kubenswrapper[4782]: I0130 19:59:54.425214 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 19:59:54 crc kubenswrapper[4782]: E0130 19:59:54.426476 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.175374 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v"] Jan 30 20:00:00 crc kubenswrapper[4782]: E0130 20:00:00.176356 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db080d8-6852-4035-bf9d-c3a99b30730e" containerName="container-00" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.176371 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db080d8-6852-4035-bf9d-c3a99b30730e" containerName="container-00" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.176555 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db080d8-6852-4035-bf9d-c3a99b30730e" containerName="container-00" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.177312 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.179614 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.179906 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.205921 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-kube-api-access-48f8b\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.205997 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-secret-volume\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.206579 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-config-volume\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.208285 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v"] Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.308358 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-config-volume\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.308641 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-kube-api-access-48f8b\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.308739 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-secret-volume\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.310485 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-config-volume\") pod 
\"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.320837 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-secret-volume\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.326179 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-kube-api-access-48f8b\") pod \"collect-profiles-29496720-7s28v\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:00 crc kubenswrapper[4782]: I0130 20:00:00.505410 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:01 crc kubenswrapper[4782]: I0130 20:00:01.069967 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v"] Jan 30 20:00:01 crc kubenswrapper[4782]: W0130 20:00:01.286119 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3225dba1_c4c4_4a63_8f87_b0c67fb68c43.slice/crio-ced3dab4e226ac02a1ac6b52438713a3e4f1203a10967cf2f48e924c2117dd4e WatchSource:0}: Error finding container ced3dab4e226ac02a1ac6b52438713a3e4f1203a10967cf2f48e924c2117dd4e: Status 404 returned error can't find the container with id ced3dab4e226ac02a1ac6b52438713a3e4f1203a10967cf2f48e924c2117dd4e Jan 30 20:00:01 crc kubenswrapper[4782]: I0130 20:00:01.454628 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" event={"ID":"3225dba1-c4c4-4a63-8f87-b0c67fb68c43","Type":"ContainerStarted","Data":"ced3dab4e226ac02a1ac6b52438713a3e4f1203a10967cf2f48e924c2117dd4e"} Jan 30 20:00:02 crc kubenswrapper[4782]: I0130 20:00:02.463068 4782 generic.go:334] "Generic (PLEG): container finished" podID="3225dba1-c4c4-4a63-8f87-b0c67fb68c43" containerID="59f59fd211690e74122bb8df07d25b9b869cf0cdeba196d2dc6114911faa647b" exitCode=0 Jan 30 20:00:02 crc kubenswrapper[4782]: I0130 20:00:02.463217 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" event={"ID":"3225dba1-c4c4-4a63-8f87-b0c67fb68c43","Type":"ContainerDied","Data":"59f59fd211690e74122bb8df07d25b9b869cf0cdeba196d2dc6114911faa647b"} Jan 30 20:00:02 crc kubenswrapper[4782]: I0130 20:00:02.719473 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-fmn9l_c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828/manager/0.log" Jan 30 20:00:02 crc kubenswrapper[4782]: I0130 20:00:02.970691 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-wmvtm_9517a543-a9e5-4253-a1b1-4154cf20a70a/manager/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.046759 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-txtbj_d82d84b6-3009-480d-b614-fbd420d90f0e/manager/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.199206 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/util/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.608277 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/pull/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.609404 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/util/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.635433 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/pull/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.884123 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.888486 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/util/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.952513 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/pull/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.958906 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/extract/0.log" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.983322 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-config-volume\") pod \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.983654 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-secret-volume\") pod \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.983784 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-kube-api-access-48f8b\") pod \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\" (UID: \"3225dba1-c4c4-4a63-8f87-b0c67fb68c43\") " Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.984259 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"3225dba1-c4c4-4a63-8f87-b0c67fb68c43" (UID: "3225dba1-c4c4-4a63-8f87-b0c67fb68c43"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.984594 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 20:00:03 crc kubenswrapper[4782]: I0130 20:00:03.990478 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-kube-api-access-48f8b" (OuterVolumeSpecName: "kube-api-access-48f8b") pod "3225dba1-c4c4-4a63-8f87-b0c67fb68c43" (UID: "3225dba1-c4c4-4a63-8f87-b0c67fb68c43"). InnerVolumeSpecName "kube-api-access-48f8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.004892 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3225dba1-c4c4-4a63-8f87-b0c67fb68c43" (UID: "3225dba1-c4c4-4a63-8f87-b0c67fb68c43"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.085308 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.085346 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48f8b\" (UniqueName: \"kubernetes.io/projected/3225dba1-c4c4-4a63-8f87-b0c67fb68c43-kube-api-access-48f8b\") on node \"crc\" DevicePath \"\"" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.157769 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-v85zd_f55acdec-57ab-4e5d-97df-ac13e7b749da/manager/0.log" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.183108 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-gdx26_f03fb99f-3277-4bff-bcd2-93756326af54/manager/0.log" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.353579 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-745gl_cd676b0f-9e48-461d-8381-998645228b54/manager/0.log" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.479681 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" event={"ID":"3225dba1-c4c4-4a63-8f87-b0c67fb68c43","Type":"ContainerDied","Data":"ced3dab4e226ac02a1ac6b52438713a3e4f1203a10967cf2f48e924c2117dd4e"} Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.479721 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ced3dab4e226ac02a1ac6b52438713a3e4f1203a10967cf2f48e924c2117dd4e" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.479776 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496720-7s28v" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.607018 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-h4kqr_1cb2fc09-3cbc-4cee-8a31-04a050d8ff04/manager/0.log" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.715758 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-fg8bm_8b27955a-e2c6-43eb-953e-af3d66a687e3/manager/0.log" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.866702 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-4jb22_2ca6290f-bb8e-484d-84bd-d9e66b9f1471/manager/0.log" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.898355 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-ffbz8_8ec19937-0358-40cb-9fc0-de54ba844b62/manager/0.log" Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.965678 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf"] Jan 30 20:00:04 crc kubenswrapper[4782]: I0130 20:00:04.975270 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496675-f8nsf"] Jan 30 20:00:05 crc kubenswrapper[4782]: I0130 20:00:05.136048 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-45lw5_79d06938-56c3-4ec4-a455-0fde260d8cdd/manager/0.log" Jan 30 20:00:05 crc kubenswrapper[4782]: I0130 20:00:05.188895 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-94rpc_ccfec61d-1461-4d91-a834-3170c98cf92f/manager/0.log" Jan 30 20:00:05 crc kubenswrapper[4782]: I0130 20:00:05.396446 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-8phk5_0301eb58-f901-4952-9f7e-7764c0e67d7f/manager/0.log" Jan 30 20:00:05 crc kubenswrapper[4782]: I0130 20:00:05.445497 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-78nps_5a54baf5-b3a2-4417-8caf-8fe321ff5f5f/manager/0.log" Jan 30 20:00:05 crc kubenswrapper[4782]: I0130 20:00:05.549092 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4d524dn_acd35126-a27d-4b4c-b56b-04ebd8358c74/manager/0.log" Jan 30 20:00:05 crc kubenswrapper[4782]: I0130 20:00:05.769613 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-678fbb89d4-gxzc4_48f3d327-7068-48e5-bd16-e8983d7dce53/operator/0.log" Jan 30 20:00:06 crc kubenswrapper[4782]: I0130 20:00:06.000005 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-v7mq6_09760161-4b39-4185-9c1e-917ba1924171/registry-server/0.log" Jan 30 20:00:06 crc kubenswrapper[4782]: I0130 20:00:06.278401 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-vsmt2_c212e215-248f-4b93-9a70-b352f425648c/manager/0.log" Jan 30 20:00:06 crc kubenswrapper[4782]: I0130 
20:00:06.281067 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-lqrfz_097a5bf9-6be5-4d4e-9547-f1318371e9db/manager/0.log" Jan 30 20:00:06 crc kubenswrapper[4782]: I0130 20:00:06.440573 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23bfc7a2-8336-4871-b013-9a7e1212ea8a" path="/var/lib/kubelet/pods/23bfc7a2-8336-4871-b013-9a7e1212ea8a/volumes" Jan 30 20:00:06 crc kubenswrapper[4782]: I0130 20:00:06.527023 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-rmkdk_52ac09b6-ec41-4ebc-ac18-018794fab085/manager/0.log" Jan 30 20:00:06 crc kubenswrapper[4782]: I0130 20:00:06.564604 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-8jcf8_fcdbdda2-62ba-4df8-9885-78c31d1e6157/operator/0.log" Jan 30 20:00:06 crc kubenswrapper[4782]: I0130 20:00:06.794338 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-brw4k_1794e6a9-01aa-43b7-841d-ca7bc24950f8/manager/0.log" Jan 30 20:00:07 crc kubenswrapper[4782]: I0130 20:00:07.001507 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-khwrr_f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3/manager/0.log" Jan 30 20:00:07 crc kubenswrapper[4782]: I0130 20:00:07.088202 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-78c8444fdd-928lz_a765979e-db86-4d07-8a0a-96c61d42137c/manager/0.log" Jan 30 20:00:07 crc kubenswrapper[4782]: I0130 20:00:07.185989 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-857dcb78d6-4vgqm_ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c/manager/0.log" Jan 30 20:00:09 crc kubenswrapper[4782]: I0130 20:00:09.410940 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:00:09 crc kubenswrapper[4782]: E0130 20:00:09.411800 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:00:20 crc kubenswrapper[4782]: I0130 20:00:20.410934 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:00:20 crc kubenswrapper[4782]: E0130 20:00:20.411780 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:00:33 crc kubenswrapper[4782]: I0130 20:00:33.038007 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-df42d_dd0f947d-ef9a-43ea-a5a0-7fe20d429739/control-plane-machine-set-operator/0.log" Jan 30 20:00:33 crc kubenswrapper[4782]: I0130 20:00:33.246270 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kr9hs_3b9097e3-f69b-49ae-9781-52921de78625/kube-rbac-proxy/0.log" Jan 30 20:00:33 crc kubenswrapper[4782]: I0130 20:00:33.285486 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kr9hs_3b9097e3-f69b-49ae-9781-52921de78625/machine-api-operator/0.log" Jan 30 20:00:34 crc kubenswrapper[4782]: I0130 20:00:34.419157 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:00:34 crc kubenswrapper[4782]: E0130 20:00:34.419444 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:00:36 crc kubenswrapper[4782]: I0130 20:00:36.833507 4782 scope.go:117] "RemoveContainer" containerID="84df9e8415c13230b6fab6bbf8dad240f12410e21f10cc104e0b880bb4f5223c" Jan 30 20:00:46 crc kubenswrapper[4782]: I0130 20:00:46.411211 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:00:46 crc kubenswrapper[4782]: E0130 20:00:46.412276 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:00:48 crc kubenswrapper[4782]: I0130 20:00:48.488747 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-wspn9_afb307de-3731-434f-bbf7-3f8fcd8cd336/cert-manager-controller/0.log" Jan 30 20:00:48 crc kubenswrapper[4782]: I0130 20:00:48.607451 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8hdgf_bd00add1-aab0-4229-837f-7f79d71ad160/cert-manager-cainjector/0.log" Jan 30 20:00:48 crc kubenswrapper[4782]: I0130 20:00:48.664399 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s26rq_f8175d03-4ab5-4ed7-ab43-c722ef6a33b3/cert-manager-webhook/0.log" Jan 30 20:00:57 crc kubenswrapper[4782]: I0130 20:00:57.411267 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:00:57 crc kubenswrapper[4782]: E0130 20:00:57.412135 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" 
podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.159040 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29496721-qdvsh"] Jan 30 20:01:00 crc kubenswrapper[4782]: E0130 20:01:00.162976 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3225dba1-c4c4-4a63-8f87-b0c67fb68c43" containerName="collect-profiles" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.163010 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="3225dba1-c4c4-4a63-8f87-b0c67fb68c43" containerName="collect-profiles" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.163398 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="3225dba1-c4c4-4a63-8f87-b0c67fb68c43" containerName="collect-profiles" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.164660 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.169937 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496721-qdvsh"] Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.315615 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-config-data\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.315657 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-fernet-keys\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.315758 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w474\" (UniqueName: \"kubernetes.io/projected/87f9f5f2-88b7-4417-adac-110906eeceb5-kube-api-access-5w474\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.315821 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-combined-ca-bundle\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.417552 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w474\" (UniqueName: \"kubernetes.io/projected/87f9f5f2-88b7-4417-adac-110906eeceb5-kube-api-access-5w474\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.417619 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-combined-ca-bundle\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " 
pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.417675 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-config-data\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.417694 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-fernet-keys\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.426456 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-config-data\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.426996 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-combined-ca-bundle\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.428858 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-fernet-keys\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.439046 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w474\" (UniqueName: \"kubernetes.io/projected/87f9f5f2-88b7-4417-adac-110906eeceb5-kube-api-access-5w474\") pod \"keystone-cron-29496721-qdvsh\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:00 crc kubenswrapper[4782]: I0130 20:01:00.546064 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:01 crc kubenswrapper[4782]: I0130 20:01:01.036077 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496721-qdvsh"] Jan 30 20:01:01 crc kubenswrapper[4782]: I0130 20:01:01.322352 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496721-qdvsh" event={"ID":"87f9f5f2-88b7-4417-adac-110906eeceb5","Type":"ContainerStarted","Data":"5213ae644a7f82ab98048bc852fc3b930fe79ae0a75aff60cb715f6f4903b31e"} Jan 30 20:01:01 crc kubenswrapper[4782]: I0130 20:01:01.322701 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496721-qdvsh" event={"ID":"87f9f5f2-88b7-4417-adac-110906eeceb5","Type":"ContainerStarted","Data":"8aa5050c21e44ddd34de9780b6552de4e0ccd0ba2d934df2ba087b377ec5d7d5"} Jan 30 20:01:01 crc kubenswrapper[4782]: I0130 20:01:01.347539 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29496721-qdvsh" podStartSLOduration=1.347523851 podStartE2EDuration="1.347523851s" podCreationTimestamp="2026-01-30 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 20:01:01.33815923 +0000 UTC m=+5437.606537255" watchObservedRunningTime="2026-01-30 20:01:01.347523851 +0000 UTC m=+5437.615901876" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.785686 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t8dz7"] Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.788089 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.822196 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8dz7"] Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.869636 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-catalog-content\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.869945 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hpt5\" (UniqueName: \"kubernetes.io/projected/ac4e4364-e7a0-4369-895d-c934fb123c90-kube-api-access-6hpt5\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.870361 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-utilities\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.971873 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-catalog-content\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " 
pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.972055 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hpt5\" (UniqueName: \"kubernetes.io/projected/ac4e4364-e7a0-4369-895d-c934fb123c90-kube-api-access-6hpt5\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.972214 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-utilities\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.972613 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-catalog-content\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:02 crc kubenswrapper[4782]: I0130 20:01:02.972715 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-utilities\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:03 crc kubenswrapper[4782]: I0130 20:01:03.005032 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hpt5\" (UniqueName: \"kubernetes.io/projected/ac4e4364-e7a0-4369-895d-c934fb123c90-kube-api-access-6hpt5\") pod \"redhat-marketplace-t8dz7\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:03 crc kubenswrapper[4782]: I0130 20:01:03.113635 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:03 crc kubenswrapper[4782]: I0130 20:01:03.627847 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8dz7"] Jan 30 20:01:04 crc kubenswrapper[4782]: I0130 20:01:04.349636 4782 generic.go:334] "Generic (PLEG): container finished" podID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerID="cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3" exitCode=0 Jan 30 20:01:04 crc kubenswrapper[4782]: I0130 20:01:04.349755 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8dz7" event={"ID":"ac4e4364-e7a0-4369-895d-c934fb123c90","Type":"ContainerDied","Data":"cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3"} Jan 30 20:01:04 crc kubenswrapper[4782]: I0130 20:01:04.350015 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8dz7" event={"ID":"ac4e4364-e7a0-4369-895d-c934fb123c90","Type":"ContainerStarted","Data":"30eaf3a104bf3c8cca7a1e2456ce875a5aba9964ae88c74de3a4bc09f7850fae"} Jan 30 20:01:05 crc kubenswrapper[4782]: I0130 20:01:05.275373 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-z6fgp_42d8b05a-8142-462f-b3ad-e496c30e8eea/nmstate-console-plugin/0.log" Jan 30 20:01:05 crc kubenswrapper[4782]: I0130 20:01:05.283683 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2blvc_61543235-f4f6-4320-b2ef-11521d91d360/nmstate-handler/0.log" Jan 30 20:01:05 crc kubenswrapper[4782]: I0130 20:01:05.359785 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8dz7" event={"ID":"ac4e4364-e7a0-4369-895d-c934fb123c90","Type":"ContainerStarted","Data":"a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425"} Jan 30 20:01:05 crc kubenswrapper[4782]: I0130 20:01:05.426561 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5bs9t_8bab1b5d-f025-4df0-ba3c-d406621dd5ac/kube-rbac-proxy/0.log" Jan 30 20:01:05 crc kubenswrapper[4782]: I0130 20:01:05.502112 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5bs9t_8bab1b5d-f025-4df0-ba3c-d406621dd5ac/nmstate-metrics/0.log" Jan 30 20:01:05 crc kubenswrapper[4782]: I0130 20:01:05.590534 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-prp59_37a25f92-459c-447c-846b-bfd73a950907/nmstate-operator/0.log" Jan 30 20:01:05 crc kubenswrapper[4782]: I0130 20:01:05.708102 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-hsdpj_7022f3b6-d4c1-4b83-b541-2125a53e701c/nmstate-webhook/0.log" Jan 30 20:01:06 crc kubenswrapper[4782]: I0130 20:01:06.396338 4782 generic.go:334] "Generic (PLEG): container finished" podID="87f9f5f2-88b7-4417-adac-110906eeceb5" containerID="5213ae644a7f82ab98048bc852fc3b930fe79ae0a75aff60cb715f6f4903b31e" exitCode=0 Jan 30 20:01:06 crc kubenswrapper[4782]: I0130 20:01:06.396414 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496721-qdvsh" event={"ID":"87f9f5f2-88b7-4417-adac-110906eeceb5","Type":"ContainerDied","Data":"5213ae644a7f82ab98048bc852fc3b930fe79ae0a75aff60cb715f6f4903b31e"} Jan 30 20:01:06 crc kubenswrapper[4782]: I0130 20:01:06.400067 4782 
generic.go:334] "Generic (PLEG): container finished" podID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerID="a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425" exitCode=0 Jan 30 20:01:06 crc kubenswrapper[4782]: I0130 20:01:06.400110 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8dz7" event={"ID":"ac4e4364-e7a0-4369-895d-c934fb123c90","Type":"ContainerDied","Data":"a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425"} Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.415018 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8dz7" event={"ID":"ac4e4364-e7a0-4369-895d-c934fb123c90","Type":"ContainerStarted","Data":"f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417"} Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.437320 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t8dz7" podStartSLOduration=2.964729261 podStartE2EDuration="5.437296519s" podCreationTimestamp="2026-01-30 20:01:02 +0000 UTC" firstStartedPulling="2026-01-30 20:01:04.352132735 +0000 UTC m=+5440.620510760" lastFinishedPulling="2026-01-30 20:01:06.824699993 +0000 UTC m=+5443.093078018" observedRunningTime="2026-01-30 20:01:07.435197837 +0000 UTC m=+5443.703575862" watchObservedRunningTime="2026-01-30 20:01:07.437296519 +0000 UTC m=+5443.705674544" Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.897703 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.982419 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-fernet-keys\") pod \"87f9f5f2-88b7-4417-adac-110906eeceb5\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.982646 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-combined-ca-bundle\") pod \"87f9f5f2-88b7-4417-adac-110906eeceb5\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.983316 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w474\" (UniqueName: \"kubernetes.io/projected/87f9f5f2-88b7-4417-adac-110906eeceb5-kube-api-access-5w474\") pod \"87f9f5f2-88b7-4417-adac-110906eeceb5\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.983432 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-config-data\") pod \"87f9f5f2-88b7-4417-adac-110906eeceb5\" (UID: \"87f9f5f2-88b7-4417-adac-110906eeceb5\") " Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.988976 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f9f5f2-88b7-4417-adac-110906eeceb5-kube-api-access-5w474" (OuterVolumeSpecName: "kube-api-access-5w474") pod "87f9f5f2-88b7-4417-adac-110906eeceb5" (UID: "87f9f5f2-88b7-4417-adac-110906eeceb5"). InnerVolumeSpecName "kube-api-access-5w474". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:01:07 crc kubenswrapper[4782]: I0130 20:01:07.997369 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "87f9f5f2-88b7-4417-adac-110906eeceb5" (UID: "87f9f5f2-88b7-4417-adac-110906eeceb5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.014314 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87f9f5f2-88b7-4417-adac-110906eeceb5" (UID: "87f9f5f2-88b7-4417-adac-110906eeceb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.055545 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-config-data" (OuterVolumeSpecName: "config-data") pod "87f9f5f2-88b7-4417-adac-110906eeceb5" (UID: "87f9f5f2-88b7-4417-adac-110906eeceb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.086291 4782 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.086617 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w474\" (UniqueName: \"kubernetes.io/projected/87f9f5f2-88b7-4417-adac-110906eeceb5-kube-api-access-5w474\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.086682 4782 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.086748 4782 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87f9f5f2-88b7-4417-adac-110906eeceb5-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.441812 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496721-qdvsh" Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.447374 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496721-qdvsh" event={"ID":"87f9f5f2-88b7-4417-adac-110906eeceb5","Type":"ContainerDied","Data":"8aa5050c21e44ddd34de9780b6552de4e0ccd0ba2d934df2ba087b377ec5d7d5"} Jan 30 20:01:08 crc kubenswrapper[4782]: I0130 20:01:08.447531 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa5050c21e44ddd34de9780b6552de4e0ccd0ba2d934df2ba087b377ec5d7d5" Jan 30 20:01:11 crc kubenswrapper[4782]: I0130 20:01:11.411331 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:01:11 crc kubenswrapper[4782]: E0130 20:01:11.412309 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:01:13 crc kubenswrapper[4782]: I0130 20:01:13.114116 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:13 crc kubenswrapper[4782]: I0130 20:01:13.115561 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:13 crc kubenswrapper[4782]: I0130 20:01:13.197747 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:13 crc kubenswrapper[4782]: I0130 20:01:13.528312 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:13 crc kubenswrapper[4782]: I0130 20:01:13.583011 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8dz7"] Jan 30 20:01:15 crc kubenswrapper[4782]: I0130 20:01:15.497348 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t8dz7" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="registry-server" containerID="cri-o://f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417" gracePeriod=2 Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.024859 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.058834 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-utilities\") pod \"ac4e4364-e7a0-4369-895d-c934fb123c90\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.059113 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hpt5\" (UniqueName: \"kubernetes.io/projected/ac4e4364-e7a0-4369-895d-c934fb123c90-kube-api-access-6hpt5\") pod \"ac4e4364-e7a0-4369-895d-c934fb123c90\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.059207 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-catalog-content\") pod \"ac4e4364-e7a0-4369-895d-c934fb123c90\" (UID: \"ac4e4364-e7a0-4369-895d-c934fb123c90\") " Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.060025 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-utilities" (OuterVolumeSpecName: "utilities") pod "ac4e4364-e7a0-4369-895d-c934fb123c90" (UID: "ac4e4364-e7a0-4369-895d-c934fb123c90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.071441 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac4e4364-e7a0-4369-895d-c934fb123c90-kube-api-access-6hpt5" (OuterVolumeSpecName: "kube-api-access-6hpt5") pod "ac4e4364-e7a0-4369-895d-c934fb123c90" (UID: "ac4e4364-e7a0-4369-895d-c934fb123c90"). InnerVolumeSpecName "kube-api-access-6hpt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.087800 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac4e4364-e7a0-4369-895d-c934fb123c90" (UID: "ac4e4364-e7a0-4369-895d-c934fb123c90"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.162203 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.162467 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4e4364-e7a0-4369-895d-c934fb123c90-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.162537 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hpt5\" (UniqueName: \"kubernetes.io/projected/ac4e4364-e7a0-4369-895d-c934fb123c90-kube-api-access-6hpt5\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.509351 4782 generic.go:334] "Generic (PLEG): container finished" podID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerID="f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417" exitCode=0 Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.509421 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8dz7" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.509443 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8dz7" event={"ID":"ac4e4364-e7a0-4369-895d-c934fb123c90","Type":"ContainerDied","Data":"f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417"} Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.510683 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8dz7" event={"ID":"ac4e4364-e7a0-4369-895d-c934fb123c90","Type":"ContainerDied","Data":"30eaf3a104bf3c8cca7a1e2456ce875a5aba9964ae88c74de3a4bc09f7850fae"} Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.510731 4782 scope.go:117] "RemoveContainer" containerID="f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.544183 4782 scope.go:117] "RemoveContainer" containerID="a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.554295 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8dz7"] Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.570737 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8dz7"] Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.584501 4782 scope.go:117] "RemoveContainer" containerID="cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.664798 4782 scope.go:117] "RemoveContainer" containerID="f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417" Jan 30 20:01:16 crc kubenswrapper[4782]: E0130 20:01:16.665535 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417\": container with ID starting with f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417 not found: ID does not exist" containerID="f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.665719 4782 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417"} err="failed to get container status \"f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417\": rpc error: code = NotFound desc = could not find container \"f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417\": container with ID starting with f89c5c65e7cdd05a150ab66b4a6a9bc39b291889d7419d8a681b8f5363cbf417 not found: ID does not exist" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.665834 4782 scope.go:117] "RemoveContainer" containerID="a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425" Jan 30 20:01:16 crc kubenswrapper[4782]: E0130 20:01:16.666379 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425\": container with ID starting with a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425 not found: ID does not exist" containerID="a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.666425 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425"} err="failed to get container status \"a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425\": rpc error: code = NotFound desc = could not find container \"a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425\": container with ID starting with a5db8e1c439f3876e836523b56099f8e0a114a95e5434b19bc0e3cb938a5b425 not found: ID does not exist" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.666454 4782 scope.go:117] "RemoveContainer" containerID="cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3" Jan 30 20:01:16 crc kubenswrapper[4782]: E0130 20:01:16.666906 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3\": container with ID starting with cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3 not found: ID does not exist" containerID="cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3" Jan 30 20:01:16 crc kubenswrapper[4782]: I0130 20:01:16.667027 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3"} err="failed to get container status \"cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3\": rpc error: code = NotFound desc = could not find container \"cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3\": container with ID starting with cb7d8ed33abdce18800db549038565e927b1140af52c47fc2515cac7e85031d3 not found: ID does not exist" Jan 30 20:01:18 crc kubenswrapper[4782]: I0130 20:01:18.430489 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" path="/var/lib/kubelet/pods/ac4e4364-e7a0-4369-895d-c934fb123c90/volumes" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.060288 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nww7g"] Jan 30 20:01:20 crc kubenswrapper[4782]: E0130 20:01:20.061720 4782 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="extract-content" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.061811 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="extract-content" Jan 30 20:01:20 crc kubenswrapper[4782]: E0130 20:01:20.061870 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="extract-utilities" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.061920 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="extract-utilities" Jan 30 20:01:20 crc kubenswrapper[4782]: E0130 20:01:20.061984 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="registry-server" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.062034 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="registry-server" Jan 30 20:01:20 crc kubenswrapper[4782]: E0130 20:01:20.062112 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f9f5f2-88b7-4417-adac-110906eeceb5" containerName="keystone-cron" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.062164 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f9f5f2-88b7-4417-adac-110906eeceb5" containerName="keystone-cron" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.062510 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f9f5f2-88b7-4417-adac-110906eeceb5" containerName="keystone-cron" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.062580 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4e4364-e7a0-4369-895d-c934fb123c90" containerName="registry-server" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.064003 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.072477 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nww7g"] Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.158168 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-catalog-content\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.158611 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rgn4\" (UniqueName: \"kubernetes.io/projected/54578c77-450f-479f-90f5-69a91117ff17-kube-api-access-5rgn4\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.158741 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-utilities\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.260985 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-catalog-content\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.261134 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rgn4\" (UniqueName: \"kubernetes.io/projected/54578c77-450f-479f-90f5-69a91117ff17-kube-api-access-5rgn4\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.261164 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-utilities\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.261477 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-catalog-content\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.262122 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-utilities\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.298104 4782 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5rgn4\" (UniqueName: \"kubernetes.io/projected/54578c77-450f-479f-90f5-69a91117ff17-kube-api-access-5rgn4\") pod \"certified-operators-nww7g\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.403451 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:20 crc kubenswrapper[4782]: I0130 20:01:20.976580 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nww7g"] Jan 30 20:01:21 crc kubenswrapper[4782]: I0130 20:01:21.577056 4782 generic.go:334] "Generic (PLEG): container finished" podID="54578c77-450f-479f-90f5-69a91117ff17" containerID="119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005" exitCode=0 Jan 30 20:01:21 crc kubenswrapper[4782]: I0130 20:01:21.577099 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nww7g" event={"ID":"54578c77-450f-479f-90f5-69a91117ff17","Type":"ContainerDied","Data":"119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005"} Jan 30 20:01:21 crc kubenswrapper[4782]: I0130 20:01:21.577124 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nww7g" event={"ID":"54578c77-450f-479f-90f5-69a91117ff17","Type":"ContainerStarted","Data":"b541f570b6faf84965c5a50a1c1edb1ed83bd9c76abe508382b6c3fe2bfe9111"} Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.474068 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dx47j"] Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.477351 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.487028 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dx47j"] Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.507655 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-utilities\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.507753 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhg7k\" (UniqueName: \"kubernetes.io/projected/76ccd1a2-b019-4797-84a4-63c6df5b3048-kube-api-access-qhg7k\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.507789 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-catalog-content\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.585740 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nww7g" event={"ID":"54578c77-450f-479f-90f5-69a91117ff17","Type":"ContainerStarted","Data":"b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666"} Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.609586 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-utilities\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.609692 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhg7k\" (UniqueName: \"kubernetes.io/projected/76ccd1a2-b019-4797-84a4-63c6df5b3048-kube-api-access-qhg7k\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.609728 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-catalog-content\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.610129 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-utilities\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.610157 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-catalog-content\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.629169 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhg7k\" (UniqueName: \"kubernetes.io/projected/76ccd1a2-b019-4797-84a4-63c6df5b3048-kube-api-access-qhg7k\") pod \"redhat-operators-dx47j\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:22 crc kubenswrapper[4782]: I0130 20:01:22.809676 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:23 crc kubenswrapper[4782]: I0130 20:01:23.321290 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dx47j"] Jan 30 20:01:23 crc kubenswrapper[4782]: I0130 20:01:23.483782 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kmgbq_92e82803-8b7d-46f3-ba40-2900590261cf/prometheus-operator/0.log" Jan 30 20:01:23 crc kubenswrapper[4782]: I0130 20:01:23.611915 4782 generic.go:334] "Generic (PLEG): container finished" podID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerID="a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740" exitCode=0 Jan 30 20:01:23 crc kubenswrapper[4782]: I0130 20:01:23.612000 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx47j" event={"ID":"76ccd1a2-b019-4797-84a4-63c6df5b3048","Type":"ContainerDied","Data":"a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740"} Jan 30 20:01:23 crc kubenswrapper[4782]: I0130 20:01:23.612050 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx47j" event={"ID":"76ccd1a2-b019-4797-84a4-63c6df5b3048","Type":"ContainerStarted","Data":"35879f435bfd05ad023bbce9c9f8b6c43139cc204f9a563de5d729dc9fb0fe54"} Jan 30 20:01:23 crc kubenswrapper[4782]: I0130 20:01:23.699398 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22_19b18d8a-aa0f-494e-9e56-55bceba788c6/prometheus-operator-admission-webhook/0.log" Jan 30 20:01:23 crc kubenswrapper[4782]: I0130 20:01:23.739984 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp_05098fbb-e910-4fec-8a31-fd98d476b941/prometheus-operator-admission-webhook/0.log" Jan 30 20:01:24 crc kubenswrapper[4782]: I0130 20:01:24.077891 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w89kr_616c3ea8-075a-475f-9896-180a02e4cc3f/perses-operator/0.log" Jan 30 20:01:24 crc kubenswrapper[4782]: I0130 20:01:24.143908 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pcglc_786ed08c-6b06-4e44-aaf4-5562ef433b88/operator/0.log" Jan 30 20:01:24 crc kubenswrapper[4782]: I0130 20:01:24.622886 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx47j" event={"ID":"76ccd1a2-b019-4797-84a4-63c6df5b3048","Type":"ContainerStarted","Data":"bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b"} Jan 30 20:01:24 crc kubenswrapper[4782]: I0130 20:01:24.625660 
4782 generic.go:334] "Generic (PLEG): container finished" podID="54578c77-450f-479f-90f5-69a91117ff17" containerID="b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666" exitCode=0 Jan 30 20:01:24 crc kubenswrapper[4782]: I0130 20:01:24.625715 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nww7g" event={"ID":"54578c77-450f-479f-90f5-69a91117ff17","Type":"ContainerDied","Data":"b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666"} Jan 30 20:01:25 crc kubenswrapper[4782]: I0130 20:01:25.410851 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:01:25 crc kubenswrapper[4782]: E0130 20:01:25.411713 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:01:25 crc kubenswrapper[4782]: I0130 20:01:25.641695 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nww7g" event={"ID":"54578c77-450f-479f-90f5-69a91117ff17","Type":"ContainerStarted","Data":"7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea"} Jan 30 20:01:25 crc kubenswrapper[4782]: I0130 20:01:25.670632 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nww7g" podStartSLOduration=1.971947651 podStartE2EDuration="5.670615253s" podCreationTimestamp="2026-01-30 20:01:20 +0000 UTC" firstStartedPulling="2026-01-30 20:01:21.579824283 +0000 UTC m=+5457.848202308" lastFinishedPulling="2026-01-30 20:01:25.278491885 +0000 UTC m=+5461.546869910" observedRunningTime="2026-01-30 20:01:25.661671182 +0000 UTC m=+5461.930049247" watchObservedRunningTime="2026-01-30 20:01:25.670615253 +0000 UTC m=+5461.938993278" Jan 30 20:01:30 crc kubenswrapper[4782]: I0130 20:01:30.403621 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:30 crc kubenswrapper[4782]: I0130 20:01:30.404266 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:30 crc kubenswrapper[4782]: I0130 20:01:30.470808 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:30 crc kubenswrapper[4782]: I0130 20:01:30.710378 4782 generic.go:334] "Generic (PLEG): container finished" podID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerID="bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b" exitCode=0 Jan 30 20:01:30 crc kubenswrapper[4782]: I0130 20:01:30.711837 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx47j" event={"ID":"76ccd1a2-b019-4797-84a4-63c6df5b3048","Type":"ContainerDied","Data":"bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b"} Jan 30 20:01:30 crc kubenswrapper[4782]: I0130 20:01:30.763208 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:31 crc kubenswrapper[4782]: I0130 20:01:31.718027 4782 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nww7g"] Jan 30 20:01:31 crc kubenswrapper[4782]: I0130 20:01:31.723018 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx47j" event={"ID":"76ccd1a2-b019-4797-84a4-63c6df5b3048","Type":"ContainerStarted","Data":"5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9"} Jan 30 20:01:32 crc kubenswrapper[4782]: I0130 20:01:32.731711 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nww7g" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="registry-server" containerID="cri-o://7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea" gracePeriod=2 Jan 30 20:01:32 crc kubenswrapper[4782]: I0130 20:01:32.810294 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:32 crc kubenswrapper[4782]: I0130 20:01:32.810372 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.240652 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.259126 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dx47j" podStartSLOduration=3.728900653 podStartE2EDuration="11.259108399s" podCreationTimestamp="2026-01-30 20:01:22 +0000 UTC" firstStartedPulling="2026-01-30 20:01:23.613388205 +0000 UTC m=+5459.881766230" lastFinishedPulling="2026-01-30 20:01:31.143595951 +0000 UTC m=+5467.411973976" observedRunningTime="2026-01-30 20:01:31.754984197 +0000 UTC m=+5468.023362262" watchObservedRunningTime="2026-01-30 20:01:33.259108399 +0000 UTC m=+5469.527486424" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.389704 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-utilities\") pod \"54578c77-450f-479f-90f5-69a91117ff17\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.390001 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rgn4\" (UniqueName: \"kubernetes.io/projected/54578c77-450f-479f-90f5-69a91117ff17-kube-api-access-5rgn4\") pod \"54578c77-450f-479f-90f5-69a91117ff17\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.390068 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-catalog-content\") pod \"54578c77-450f-479f-90f5-69a91117ff17\" (UID: \"54578c77-450f-479f-90f5-69a91117ff17\") " Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.390453 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-utilities" (OuterVolumeSpecName: "utilities") pod "54578c77-450f-479f-90f5-69a91117ff17" (UID: "54578c77-450f-479f-90f5-69a91117ff17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.405380 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54578c77-450f-479f-90f5-69a91117ff17-kube-api-access-5rgn4" (OuterVolumeSpecName: "kube-api-access-5rgn4") pod "54578c77-450f-479f-90f5-69a91117ff17" (UID: "54578c77-450f-479f-90f5-69a91117ff17"). InnerVolumeSpecName "kube-api-access-5rgn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.442342 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54578c77-450f-479f-90f5-69a91117ff17" (UID: "54578c77-450f-479f-90f5-69a91117ff17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.491864 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rgn4\" (UniqueName: \"kubernetes.io/projected/54578c77-450f-479f-90f5-69a91117ff17-kube-api-access-5rgn4\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.491911 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.491921 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54578c77-450f-479f-90f5-69a91117ff17-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.744402 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nww7g" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.744456 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nww7g" event={"ID":"54578c77-450f-479f-90f5-69a91117ff17","Type":"ContainerDied","Data":"7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea"} Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.744511 4782 scope.go:117] "RemoveContainer" containerID="7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.748326 4782 generic.go:334] "Generic (PLEG): container finished" podID="54578c77-450f-479f-90f5-69a91117ff17" containerID="7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea" exitCode=0 Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.748431 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nww7g" event={"ID":"54578c77-450f-479f-90f5-69a91117ff17","Type":"ContainerDied","Data":"b541f570b6faf84965c5a50a1c1edb1ed83bd9c76abe508382b6c3fe2bfe9111"} Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.776030 4782 scope.go:117] "RemoveContainer" containerID="b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.793647 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nww7g"] Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.803530 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nww7g"] Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.819398 4782 scope.go:117] "RemoveContainer" containerID="119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.868297 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dx47j" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="registry-server" probeResult="failure" output=< Jan 30 20:01:33 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 20:01:33 crc kubenswrapper[4782]: > Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.875572 4782 scope.go:117] "RemoveContainer" containerID="7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea" Jan 30 20:01:33 crc kubenswrapper[4782]: E0130 20:01:33.876052 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea\": container with ID starting with 7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea not found: ID does not exist" containerID="7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.876101 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea"} err="failed to get container status \"7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea\": rpc error: code = NotFound desc = could not find container \"7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea\": container with ID starting with 7567acbe6f9b88ae558a97c1b308241e8a9004d37e60ad7540703a0cb50d2fea not found: ID does not exist" Jan 30 20:01:33 crc kubenswrapper[4782]: 
I0130 20:01:33.876131 4782 scope.go:117] "RemoveContainer" containerID="b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666" Jan 30 20:01:33 crc kubenswrapper[4782]: E0130 20:01:33.878124 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666\": container with ID starting with b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666 not found: ID does not exist" containerID="b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.878145 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666"} err="failed to get container status \"b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666\": rpc error: code = NotFound desc = could not find container \"b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666\": container with ID starting with b34fbe71ef37da5956efb052779704358ec1502bfb02f5faa0e42d5a3b5ca666 not found: ID does not exist" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.878159 4782 scope.go:117] "RemoveContainer" containerID="119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005" Jan 30 20:01:33 crc kubenswrapper[4782]: E0130 20:01:33.878526 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005\": container with ID starting with 119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005 not found: ID does not exist" containerID="119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005" Jan 30 20:01:33 crc kubenswrapper[4782]: I0130 20:01:33.878560 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005"} err="failed to get container status \"119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005\": rpc error: code = NotFound desc = could not find container \"119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005\": container with ID starting with 119df2760baaa8a53bf2117e371c3800accf8db89b49de2c0102c239f4a91005 not found: ID does not exist" Jan 30 20:01:34 crc kubenswrapper[4782]: I0130 20:01:34.439219 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54578c77-450f-479f-90f5-69a91117ff17" path="/var/lib/kubelet/pods/54578c77-450f-479f-90f5-69a91117ff17/volumes" Jan 30 20:01:38 crc kubenswrapper[4782]: I0130 20:01:38.412024 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:01:38 crc kubenswrapper[4782]: E0130 20:01:38.413163 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.036253 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-6968d8fdc4-nlj8p_b45e2233-e51f-4f71-bc45-cd73fa8302de/kube-rbac-proxy/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.268750 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-nlj8p_b45e2233-e51f-4f71-bc45-cd73fa8302de/controller/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.367130 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-m7rqg_67ea755b-acbd-4894-9070-356cb15f18d3/frr-k8s-webhook-server/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.468000 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.600353 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.628968 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.658701 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.720741 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.857470 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.870775 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.890855 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:01:41 crc kubenswrapper[4782]: I0130 20:01:41.909815 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.094542 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.103004 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.108985 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.154832 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/controller/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.293090 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/frr-metrics/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.357938 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/kube-rbac-proxy-frr/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.377162 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/kube-rbac-proxy/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.509142 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/reloader/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.676855 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-597897467b-d7mjb_a536d77e-78b4-4ec2-a0d2-80e853e186fb/manager/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.869555 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.881283 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-bccd97cd9-rmxjs_c696687d-14f1-4f3b-b9ee-36e3845aa7c2/webhook-server/0.log" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.924416 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:42 crc kubenswrapper[4782]: I0130 20:01:42.956848 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jqzpm_824018fe-7708-4c75-aaac-19bfb9f22405/kube-rbac-proxy/0.log" Jan 30 20:01:43 crc kubenswrapper[4782]: I0130 20:01:43.108979 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dx47j"] Jan 30 20:01:43 crc kubenswrapper[4782]: I0130 20:01:43.525899 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jqzpm_824018fe-7708-4c75-aaac-19bfb9f22405/speaker/0.log" Jan 30 20:01:43 crc kubenswrapper[4782]: I0130 20:01:43.950743 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/frr/0.log" Jan 30 20:01:44 crc kubenswrapper[4782]: I0130 20:01:44.862924 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dx47j" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="registry-server" containerID="cri-o://5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9" gracePeriod=2 Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.320615 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.450546 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhg7k\" (UniqueName: \"kubernetes.io/projected/76ccd1a2-b019-4797-84a4-63c6df5b3048-kube-api-access-qhg7k\") pod \"76ccd1a2-b019-4797-84a4-63c6df5b3048\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.450749 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-catalog-content\") pod \"76ccd1a2-b019-4797-84a4-63c6df5b3048\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.450892 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-utilities\") pod \"76ccd1a2-b019-4797-84a4-63c6df5b3048\" (UID: \"76ccd1a2-b019-4797-84a4-63c6df5b3048\") " Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.451830 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-utilities" (OuterVolumeSpecName: "utilities") pod "76ccd1a2-b019-4797-84a4-63c6df5b3048" (UID: "76ccd1a2-b019-4797-84a4-63c6df5b3048"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.460393 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76ccd1a2-b019-4797-84a4-63c6df5b3048-kube-api-access-qhg7k" (OuterVolumeSpecName: "kube-api-access-qhg7k") pod "76ccd1a2-b019-4797-84a4-63c6df5b3048" (UID: "76ccd1a2-b019-4797-84a4-63c6df5b3048"). InnerVolumeSpecName "kube-api-access-qhg7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.553418 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhg7k\" (UniqueName: \"kubernetes.io/projected/76ccd1a2-b019-4797-84a4-63c6df5b3048-kube-api-access-qhg7k\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.553480 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.587090 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76ccd1a2-b019-4797-84a4-63c6df5b3048" (UID: "76ccd1a2-b019-4797-84a4-63c6df5b3048"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.655334 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76ccd1a2-b019-4797-84a4-63c6df5b3048-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.875999 4782 generic.go:334] "Generic (PLEG): container finished" podID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerID="5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9" exitCode=0 Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.876064 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx47j" event={"ID":"76ccd1a2-b019-4797-84a4-63c6df5b3048","Type":"ContainerDied","Data":"5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9"} Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.876133 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dx47j" event={"ID":"76ccd1a2-b019-4797-84a4-63c6df5b3048","Type":"ContainerDied","Data":"35879f435bfd05ad023bbce9c9f8b6c43139cc204f9a563de5d729dc9fb0fe54"} Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.876164 4782 scope.go:117] "RemoveContainer" containerID="5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.876088 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dx47j" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.934398 4782 scope.go:117] "RemoveContainer" containerID="bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b" Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.943168 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dx47j"] Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.956127 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dx47j"] Jan 30 20:01:45 crc kubenswrapper[4782]: I0130 20:01:45.961429 4782 scope.go:117] "RemoveContainer" containerID="a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740" Jan 30 20:01:46 crc kubenswrapper[4782]: I0130 20:01:46.008148 4782 scope.go:117] "RemoveContainer" containerID="5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9" Jan 30 20:01:46 crc kubenswrapper[4782]: E0130 20:01:46.009200 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9\": container with ID starting with 5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9 not found: ID does not exist" containerID="5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9" Jan 30 20:01:46 crc kubenswrapper[4782]: I0130 20:01:46.009281 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9"} err="failed to get container status \"5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9\": rpc error: code = NotFound desc = could not find container \"5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9\": container with ID starting with 5a0411f6a72a51c99d2946dad69e3b9e09b1d00af17a64275d89f828305003a9 not found: ID does not exist" Jan 30 20:01:46 crc 
kubenswrapper[4782]: I0130 20:01:46.009345 4782 scope.go:117] "RemoveContainer" containerID="bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b" Jan 30 20:01:46 crc kubenswrapper[4782]: E0130 20:01:46.009750 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b\": container with ID starting with bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b not found: ID does not exist" containerID="bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b" Jan 30 20:01:46 crc kubenswrapper[4782]: I0130 20:01:46.009799 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b"} err="failed to get container status \"bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b\": rpc error: code = NotFound desc = could not find container \"bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b\": container with ID starting with bd8bc2b344e8c04c9d9b3a8a0fdbdd0e96615c50541930d840a678fb0a514e2b not found: ID does not exist" Jan 30 20:01:46 crc kubenswrapper[4782]: I0130 20:01:46.009837 4782 scope.go:117] "RemoveContainer" containerID="a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740" Jan 30 20:01:46 crc kubenswrapper[4782]: E0130 20:01:46.010161 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740\": container with ID starting with a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740 not found: ID does not exist" containerID="a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740" Jan 30 20:01:46 crc kubenswrapper[4782]: I0130 20:01:46.010193 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740"} err="failed to get container status \"a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740\": rpc error: code = NotFound desc = could not find container \"a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740\": container with ID starting with a5460984a9642063130afceced330907e694ad1f3978af8df44a34a0f4738740 not found: ID does not exist" Jan 30 20:01:46 crc kubenswrapper[4782]: I0130 20:01:46.421981 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" path="/var/lib/kubelet/pods/76ccd1a2-b019-4797-84a4-63c6df5b3048/volumes" Jan 30 20:01:52 crc kubenswrapper[4782]: I0130 20:01:52.411397 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:01:52 crc kubenswrapper[4782]: E0130 20:01:52.412306 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.316521 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/util/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.433718 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/pull/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.450124 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/util/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.497854 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/pull/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.664980 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/util/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.666158 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/pull/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.667790 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/extract/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.828670 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/util/0.log" Jan 30 20:01:58 crc kubenswrapper[4782]: I0130 20:01:58.986432 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/util/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.035179 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/pull/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.036102 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/pull/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.176992 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/pull/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.185800 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/util/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.214330 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/extract/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.348621 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/util/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.475660 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/util/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.486210 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/pull/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.495148 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/pull/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.682455 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/util/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.724452 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/pull/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.737971 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/extract/0.log" Jan 30 20:01:59 crc kubenswrapper[4782]: I0130 20:01:59.866005 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-utilities/0.log" Jan 30 20:02:00 crc kubenswrapper[4782]: I0130 20:02:00.006387 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-utilities/0.log" Jan 30 20:02:00 crc kubenswrapper[4782]: I0130 20:02:00.029359 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-content/0.log" Jan 30 20:02:00 crc kubenswrapper[4782]: I0130 20:02:00.030908 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-content/0.log" Jan 30 20:02:00 crc kubenswrapper[4782]: I0130 20:02:00.870771 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-utilities/0.log" Jan 30 20:02:00 crc kubenswrapper[4782]: I0130 20:02:00.905725 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-content/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.081027 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-utilities/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.314347 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-utilities/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.390258 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-content/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.409500 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-content/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.581955 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-content/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.604945 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-utilities/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.626402 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/registry-server/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.894412 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qsfzf_5d993e9b-840e-4235-9d1e-9d2cf1928afc/marketplace-operator/0.log" Jan 30 20:02:01 crc kubenswrapper[4782]: I0130 20:02:01.973072 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-utilities/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.133136 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-utilities/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.168593 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-content/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.204677 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-content/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.392178 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/registry-server/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.777085 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-content/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.788661 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-utilities/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.892855 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-utilities/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.935729 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/registry-server/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.979945 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-utilities/0.log" Jan 30 20:02:02 crc kubenswrapper[4782]: I0130 20:02:02.984066 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-content/0.log" Jan 30 20:02:03 crc kubenswrapper[4782]: I0130 20:02:03.020129 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-content/0.log" Jan 30 20:02:03 crc kubenswrapper[4782]: I0130 20:02:03.183591 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-content/0.log" Jan 30 20:02:03 crc kubenswrapper[4782]: I0130 20:02:03.189297 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-utilities/0.log" Jan 30 20:02:03 crc kubenswrapper[4782]: I0130 20:02:03.825320 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/registry-server/0.log" Jan 30 20:02:05 crc kubenswrapper[4782]: I0130 20:02:05.411804 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:02:05 crc kubenswrapper[4782]: E0130 20:02:05.412937 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:02:17 crc kubenswrapper[4782]: I0130 20:02:17.581544 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kmgbq_92e82803-8b7d-46f3-ba40-2900590261cf/prometheus-operator/0.log" Jan 30 20:02:17 crc kubenswrapper[4782]: I0130 20:02:17.586735 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp_05098fbb-e910-4fec-8a31-fd98d476b941/prometheus-operator-admission-webhook/0.log" Jan 30 20:02:17 crc kubenswrapper[4782]: I0130 20:02:17.601357 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22_19b18d8a-aa0f-494e-9e56-55bceba788c6/prometheus-operator-admission-webhook/0.log" Jan 30 20:02:17 crc kubenswrapper[4782]: I0130 20:02:17.801931 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w89kr_616c3ea8-075a-475f-9896-180a02e4cc3f/perses-operator/0.log" Jan 30 20:02:17 crc kubenswrapper[4782]: I0130 
20:02:17.819089 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pcglc_786ed08c-6b06-4e44-aaf4-5562ef433b88/operator/0.log" Jan 30 20:02:20 crc kubenswrapper[4782]: I0130 20:02:20.411190 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:02:20 crc kubenswrapper[4782]: E0130 20:02:20.411944 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:02:22 crc kubenswrapper[4782]: E0130 20:02:22.920473 4782 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.212:35470->38.102.83.212:36463: write tcp 38.102.83.212:35470->38.102.83.212:36463: write: broken pipe Jan 30 20:02:31 crc kubenswrapper[4782]: I0130 20:02:31.411184 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:02:31 crc kubenswrapper[4782]: E0130 20:02:31.411955 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:02:46 crc kubenswrapper[4782]: I0130 20:02:46.411953 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:02:46 crc kubenswrapper[4782]: E0130 20:02:46.416309 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:02:57 crc kubenswrapper[4782]: I0130 20:02:57.411356 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:02:57 crc kubenswrapper[4782]: E0130 20:02:57.412396 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:03:08 crc kubenswrapper[4782]: I0130 20:03:08.448587 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:03:08 crc kubenswrapper[4782]: E0130 20:03:08.449598 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:03:21 crc kubenswrapper[4782]: I0130 20:03:21.411748 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:03:21 crc kubenswrapper[4782]: I0130 20:03:21.910629 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"e735abd3f92a6ae6e189a47ed0c4a0e4ad261bbd47b37e9b91f587b3a01cd393"} Jan 30 20:04:18 crc kubenswrapper[4782]: I0130 20:04:18.559961 4782 generic.go:334] "Generic (PLEG): container finished" podID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerID="faa53245cf537f46902137d0951d27d0b3c87121b856bd61b49b393ff7d7d2af" exitCode=0 Jan 30 20:04:18 crc kubenswrapper[4782]: I0130 20:04:18.560092 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" event={"ID":"ec9dc11b-a597-42b6-859c-1987db75b2d0","Type":"ContainerDied","Data":"faa53245cf537f46902137d0951d27d0b3c87121b856bd61b49b393ff7d7d2af"} Jan 30 20:04:18 crc kubenswrapper[4782]: I0130 20:04:18.561359 4782 scope.go:117] "RemoveContainer" containerID="faa53245cf537f46902137d0951d27d0b3c87121b856bd61b49b393ff7d7d2af" Jan 30 20:04:19 crc kubenswrapper[4782]: I0130 20:04:19.433520 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9b9gz_must-gather-2fb9w_ec9dc11b-a597-42b6-859c-1987db75b2d0/gather/0.log" Jan 30 20:04:27 crc kubenswrapper[4782]: I0130 20:04:27.276526 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9b9gz/must-gather-2fb9w"] Jan 30 20:04:27 crc kubenswrapper[4782]: I0130 20:04:27.277339 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerName="copy" containerID="cri-o://c06df32bbcf2b79131bbd4d7ce77be91743c3a0d7ddd07f7c97fa357e42f7050" gracePeriod=2 Jan 30 20:04:27 crc kubenswrapper[4782]: I0130 20:04:27.288789 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9b9gz/must-gather-2fb9w"] Jan 30 20:04:27 crc kubenswrapper[4782]: I0130 20:04:27.669776 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9b9gz_must-gather-2fb9w_ec9dc11b-a597-42b6-859c-1987db75b2d0/copy/0.log" Jan 30 20:04:27 crc kubenswrapper[4782]: I0130 20:04:27.670745 4782 generic.go:334] "Generic (PLEG): container finished" podID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerID="c06df32bbcf2b79131bbd4d7ce77be91743c3a0d7ddd07f7c97fa357e42f7050" exitCode=143 Jan 30 20:04:27 crc kubenswrapper[4782]: I0130 20:04:27.906709 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9b9gz_must-gather-2fb9w_ec9dc11b-a597-42b6-859c-1987db75b2d0/copy/0.log" Jan 30 20:04:27 crc kubenswrapper[4782]: I0130 20:04:27.907181 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.072597 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk45d\" (UniqueName: \"kubernetes.io/projected/ec9dc11b-a597-42b6-859c-1987db75b2d0-kube-api-access-zk45d\") pod \"ec9dc11b-a597-42b6-859c-1987db75b2d0\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.072697 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec9dc11b-a597-42b6-859c-1987db75b2d0-must-gather-output\") pod \"ec9dc11b-a597-42b6-859c-1987db75b2d0\" (UID: \"ec9dc11b-a597-42b6-859c-1987db75b2d0\") " Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.082045 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9dc11b-a597-42b6-859c-1987db75b2d0-kube-api-access-zk45d" (OuterVolumeSpecName: "kube-api-access-zk45d") pod "ec9dc11b-a597-42b6-859c-1987db75b2d0" (UID: "ec9dc11b-a597-42b6-859c-1987db75b2d0"). InnerVolumeSpecName "kube-api-access-zk45d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.175387 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk45d\" (UniqueName: \"kubernetes.io/projected/ec9dc11b-a597-42b6-859c-1987db75b2d0-kube-api-access-zk45d\") on node \"crc\" DevicePath \"\"" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.268421 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec9dc11b-a597-42b6-859c-1987db75b2d0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ec9dc11b-a597-42b6-859c-1987db75b2d0" (UID: "ec9dc11b-a597-42b6-859c-1987db75b2d0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.278859 4782 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ec9dc11b-a597-42b6-859c-1987db75b2d0-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.425939 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" path="/var/lib/kubelet/pods/ec9dc11b-a597-42b6-859c-1987db75b2d0/volumes" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.685928 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9b9gz_must-gather-2fb9w_ec9dc11b-a597-42b6-859c-1987db75b2d0/copy/0.log" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.686733 4782 scope.go:117] "RemoveContainer" containerID="c06df32bbcf2b79131bbd4d7ce77be91743c3a0d7ddd07f7c97fa357e42f7050" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.686789 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9b9gz/must-gather-2fb9w" Jan 30 20:04:28 crc kubenswrapper[4782]: I0130 20:04:28.729801 4782 scope.go:117] "RemoveContainer" containerID="faa53245cf537f46902137d0951d27d0b3c87121b856bd61b49b393ff7d7d2af" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.506841 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t98zx"] Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.507882 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerName="copy" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.507898 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerName="copy" Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.507922 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="registry-server" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.507931 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="registry-server" Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.507955 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="extract-utilities" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.507964 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="extract-utilities" Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.507979 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="extract-content" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.507987 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="extract-content" Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.508003 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="extract-utilities" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508011 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="extract-utilities" Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.508038 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="registry-server" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508046 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="registry-server" Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.508058 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="extract-content" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508065 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="extract-content" Jan 30 20:04:36 crc kubenswrapper[4782]: E0130 20:04:36.508082 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerName="gather" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508091 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" 
containerName="gather" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508342 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerName="gather" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508355 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="76ccd1a2-b019-4797-84a4-63c6df5b3048" containerName="registry-server" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508380 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9dc11b-a597-42b6-859c-1987db75b2d0" containerName="copy" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.508395 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="54578c77-450f-479f-90f5-69a91117ff17" containerName="registry-server" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.510175 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.524708 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t98zx"] Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.571768 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-catalog-content\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.571903 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-utilities\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.572018 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqcxn\" (UniqueName: \"kubernetes.io/projected/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-kube-api-access-dqcxn\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.674340 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqcxn\" (UniqueName: \"kubernetes.io/projected/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-kube-api-access-dqcxn\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.674440 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-catalog-content\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.674508 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-utilities\") pod \"community-operators-t98zx\" (UID: 
\"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.675022 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-utilities\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.675073 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-catalog-content\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.694841 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqcxn\" (UniqueName: \"kubernetes.io/projected/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-kube-api-access-dqcxn\") pod \"community-operators-t98zx\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:36 crc kubenswrapper[4782]: I0130 20:04:36.838626 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:37 crc kubenswrapper[4782]: I0130 20:04:37.460071 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t98zx"] Jan 30 20:04:37 crc kubenswrapper[4782]: I0130 20:04:37.799622 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerStarted","Data":"c4ca8a0053d28cd95181d22f4cc4415c1d11ef64b7de41dbd9b1c3cb55609511"} Jan 30 20:04:37 crc kubenswrapper[4782]: I0130 20:04:37.799944 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerStarted","Data":"c7272377ecd626f512a0f85bcbfb4ae2f8ca16b5b31599305e6d121cc541f8c2"} Jan 30 20:04:37 crc kubenswrapper[4782]: I0130 20:04:37.803269 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 20:04:38 crc kubenswrapper[4782]: I0130 20:04:38.810357 4782 generic.go:334] "Generic (PLEG): container finished" podID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerID="c4ca8a0053d28cd95181d22f4cc4415c1d11ef64b7de41dbd9b1c3cb55609511" exitCode=0 Jan 30 20:04:38 crc kubenswrapper[4782]: I0130 20:04:38.810440 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerDied","Data":"c4ca8a0053d28cd95181d22f4cc4415c1d11ef64b7de41dbd9b1c3cb55609511"} Jan 30 20:04:39 crc kubenswrapper[4782]: I0130 20:04:39.823101 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerStarted","Data":"32007611f1facec0316bcded60962382542ed69aa85eaec5b977b7b8da960e76"} Jan 30 20:04:40 crc kubenswrapper[4782]: I0130 20:04:40.833903 4782 generic.go:334] "Generic (PLEG): container finished" podID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" 
containerID="32007611f1facec0316bcded60962382542ed69aa85eaec5b977b7b8da960e76" exitCode=0 Jan 30 20:04:40 crc kubenswrapper[4782]: I0130 20:04:40.834022 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerDied","Data":"32007611f1facec0316bcded60962382542ed69aa85eaec5b977b7b8da960e76"} Jan 30 20:04:41 crc kubenswrapper[4782]: I0130 20:04:41.847038 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerStarted","Data":"fc6896929383b3309d8116f2b21f3f16748ffbe68ada600974acdf67e23683d5"} Jan 30 20:04:46 crc kubenswrapper[4782]: I0130 20:04:46.838910 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:46 crc kubenswrapper[4782]: I0130 20:04:46.839537 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:46 crc kubenswrapper[4782]: I0130 20:04:46.889966 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:46 crc kubenswrapper[4782]: I0130 20:04:46.925792 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t98zx" podStartSLOduration=7.386941914 podStartE2EDuration="10.925762506s" podCreationTimestamp="2026-01-30 20:04:36 +0000 UTC" firstStartedPulling="2026-01-30 20:04:37.802963807 +0000 UTC m=+5654.071341832" lastFinishedPulling="2026-01-30 20:04:41.341784359 +0000 UTC m=+5657.610162424" observedRunningTime="2026-01-30 20:04:41.869813542 +0000 UTC m=+5658.138191567" watchObservedRunningTime="2026-01-30 20:04:46.925762506 +0000 UTC m=+5663.194140521" Jan 30 20:04:46 crc kubenswrapper[4782]: I0130 20:04:46.968676 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:47 crc kubenswrapper[4782]: I0130 20:04:47.134162 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t98zx"] Jan 30 20:04:48 crc kubenswrapper[4782]: I0130 20:04:48.911990 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t98zx" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="registry-server" containerID="cri-o://fc6896929383b3309d8116f2b21f3f16748ffbe68ada600974acdf67e23683d5" gracePeriod=2 Jan 30 20:04:49 crc kubenswrapper[4782]: I0130 20:04:49.922250 4782 generic.go:334] "Generic (PLEG): container finished" podID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerID="fc6896929383b3309d8116f2b21f3f16748ffbe68ada600974acdf67e23683d5" exitCode=0 Jan 30 20:04:49 crc kubenswrapper[4782]: I0130 20:04:49.922268 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerDied","Data":"fc6896929383b3309d8116f2b21f3f16748ffbe68ada600974acdf67e23683d5"} Jan 30 20:04:49 crc kubenswrapper[4782]: I0130 20:04:49.922593 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t98zx" 
event={"ID":"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8","Type":"ContainerDied","Data":"c7272377ecd626f512a0f85bcbfb4ae2f8ca16b5b31599305e6d121cc541f8c2"} Jan 30 20:04:49 crc kubenswrapper[4782]: I0130 20:04:49.922609 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7272377ecd626f512a0f85bcbfb4ae2f8ca16b5b31599305e6d121cc541f8c2" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.043750 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.151088 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-catalog-content\") pod \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.151158 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqcxn\" (UniqueName: \"kubernetes.io/projected/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-kube-api-access-dqcxn\") pod \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.151280 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-utilities\") pod \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\" (UID: \"2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8\") " Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.152428 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-utilities" (OuterVolumeSpecName: "utilities") pod "2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" (UID: "2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.159594 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-kube-api-access-dqcxn" (OuterVolumeSpecName: "kube-api-access-dqcxn") pod "2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" (UID: "2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8"). InnerVolumeSpecName "kube-api-access-dqcxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.213093 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" (UID: "2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.253132 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.253171 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqcxn\" (UniqueName: \"kubernetes.io/projected/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-kube-api-access-dqcxn\") on node \"crc\" DevicePath \"\"" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.253180 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.933866 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t98zx" Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.960858 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t98zx"] Jan 30 20:04:50 crc kubenswrapper[4782]: I0130 20:04:50.971187 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t98zx"] Jan 30 20:04:52 crc kubenswrapper[4782]: I0130 20:04:52.421587 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" path="/var/lib/kubelet/pods/2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8/volumes" Jan 30 20:05:49 crc kubenswrapper[4782]: I0130 20:05:49.792936 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 20:05:49 crc kubenswrapper[4782]: I0130 20:05:49.793558 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 20:06:19 crc kubenswrapper[4782]: I0130 20:06:19.793197 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 20:06:19 crc kubenswrapper[4782]: I0130 20:06:19.793931 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 20:06:49 crc kubenswrapper[4782]: I0130 20:06:49.792382 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 20:06:49 crc kubenswrapper[4782]: 
I0130 20:06:49.792995 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 20:06:49 crc kubenswrapper[4782]: I0130 20:06:49.793070 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 20:06:49 crc kubenswrapper[4782]: I0130 20:06:49.794207 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e735abd3f92a6ae6e189a47ed0c4a0e4ad261bbd47b37e9b91f587b3a01cd393"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 20:06:49 crc kubenswrapper[4782]: I0130 20:06:49.794301 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://e735abd3f92a6ae6e189a47ed0c4a0e4ad261bbd47b37e9b91f587b3a01cd393" gracePeriod=600 Jan 30 20:06:50 crc kubenswrapper[4782]: I0130 20:06:50.243019 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="e735abd3f92a6ae6e189a47ed0c4a0e4ad261bbd47b37e9b91f587b3a01cd393" exitCode=0 Jan 30 20:06:50 crc kubenswrapper[4782]: I0130 20:06:50.243111 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"e735abd3f92a6ae6e189a47ed0c4a0e4ad261bbd47b37e9b91f587b3a01cd393"} Jan 30 20:06:50 crc kubenswrapper[4782]: I0130 20:06:50.243463 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04"} Jan 30 20:06:50 crc kubenswrapper[4782]: I0130 20:06:50.243495 4782 scope.go:117] "RemoveContainer" containerID="9f4a10c0fa64f3728c1a64414f532e52d7c4116c4bb8e0a40f6e60ac40d6e8df" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.215430 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5bgg/must-gather-2nzsh"] Jan 30 20:07:39 crc kubenswrapper[4782]: E0130 20:07:39.216549 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="registry-server" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.216657 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="registry-server" Jan 30 20:07:39 crc kubenswrapper[4782]: E0130 20:07:39.216681 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="extract-content" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.216689 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="extract-content" Jan 30 20:07:39 crc kubenswrapper[4782]: E0130 20:07:39.216719 4782 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="extract-utilities" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.216728 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="extract-utilities" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.217024 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae964a2-2b8e-4eca-9148-d4d4ecdd10a8" containerName="registry-server" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.218580 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.222768 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w5bgg"/"openshift-service-ca.crt" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.222960 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-w5bgg"/"kube-root-ca.crt" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.305118 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f498c4c6-fa9e-4e2c-a41d-217b45043e07-must-gather-output\") pod \"must-gather-2nzsh\" (UID: \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.305206 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2znf2\" (UniqueName: \"kubernetes.io/projected/f498c4c6-fa9e-4e2c-a41d-217b45043e07-kube-api-access-2znf2\") pod \"must-gather-2nzsh\" (UID: \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.329062 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w5bgg/must-gather-2nzsh"] Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.406859 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f498c4c6-fa9e-4e2c-a41d-217b45043e07-must-gather-output\") pod \"must-gather-2nzsh\" (UID: \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.406940 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2znf2\" (UniqueName: \"kubernetes.io/projected/f498c4c6-fa9e-4e2c-a41d-217b45043e07-kube-api-access-2znf2\") pod \"must-gather-2nzsh\" (UID: \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.407392 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f498c4c6-fa9e-4e2c-a41d-217b45043e07-must-gather-output\") pod \"must-gather-2nzsh\" (UID: \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.449793 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2znf2\" (UniqueName: \"kubernetes.io/projected/f498c4c6-fa9e-4e2c-a41d-217b45043e07-kube-api-access-2znf2\") pod \"must-gather-2nzsh\" (UID: 
\"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:39 crc kubenswrapper[4782]: I0130 20:07:39.539512 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:07:40 crc kubenswrapper[4782]: I0130 20:07:40.152694 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w5bgg/must-gather-2nzsh"] Jan 30 20:07:40 crc kubenswrapper[4782]: W0130 20:07:40.161452 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf498c4c6_fa9e_4e2c_a41d_217b45043e07.slice/crio-eb2c9b592c567d5219decd8ce21feff53d88067c5ca10715659dce093378dca3 WatchSource:0}: Error finding container eb2c9b592c567d5219decd8ce21feff53d88067c5ca10715659dce093378dca3: Status 404 returned error can't find the container with id eb2c9b592c567d5219decd8ce21feff53d88067c5ca10715659dce093378dca3 Jan 30 20:07:40 crc kubenswrapper[4782]: I0130 20:07:40.799472 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" event={"ID":"f498c4c6-fa9e-4e2c-a41d-217b45043e07","Type":"ContainerStarted","Data":"8583b5ca2cc0a5e1a2db33736d987066b50218486f716cfb75d0aa895c94e5bd"} Jan 30 20:07:40 crc kubenswrapper[4782]: I0130 20:07:40.799995 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" event={"ID":"f498c4c6-fa9e-4e2c-a41d-217b45043e07","Type":"ContainerStarted","Data":"86f473b437bf96416d0ab246050d7b1fe8044ecc55ee521f0908d36a6d275722"} Jan 30 20:07:40 crc kubenswrapper[4782]: I0130 20:07:40.800012 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" event={"ID":"f498c4c6-fa9e-4e2c-a41d-217b45043e07","Type":"ContainerStarted","Data":"eb2c9b592c567d5219decd8ce21feff53d88067c5ca10715659dce093378dca3"} Jan 30 20:07:40 crc kubenswrapper[4782]: I0130 20:07:40.821194 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" podStartSLOduration=1.821176344 podStartE2EDuration="1.821176344s" podCreationTimestamp="2026-01-30 20:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 20:07:40.810211624 +0000 UTC m=+5837.078589639" watchObservedRunningTime="2026-01-30 20:07:40.821176344 +0000 UTC m=+5837.089554369" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.356754 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-5bgtz"] Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.358363 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.361700 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-w5bgg"/"default-dockercfg-xm77r" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.406709 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd5cc\" (UniqueName: \"kubernetes.io/projected/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-kube-api-access-bd5cc\") pod \"crc-debug-5bgtz\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.406864 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-host\") pod \"crc-debug-5bgtz\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.513705 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd5cc\" (UniqueName: \"kubernetes.io/projected/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-kube-api-access-bd5cc\") pod \"crc-debug-5bgtz\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.513864 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-host\") pod \"crc-debug-5bgtz\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.515126 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-host\") pod \"crc-debug-5bgtz\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.554847 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd5cc\" (UniqueName: \"kubernetes.io/projected/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-kube-api-access-bd5cc\") pod \"crc-debug-5bgtz\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.715906 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:07:44 crc kubenswrapper[4782]: W0130 20:07:44.775408 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf97f47_1e3c_4d4b_a723_0e1bdeedfb93.slice/crio-10e028a502196f18f400f587990845a8d3c4768fcce62f0553e3df6603163d68 WatchSource:0}: Error finding container 10e028a502196f18f400f587990845a8d3c4768fcce62f0553e3df6603163d68: Status 404 returned error can't find the container with id 10e028a502196f18f400f587990845a8d3c4768fcce62f0553e3df6603163d68 Jan 30 20:07:44 crc kubenswrapper[4782]: I0130 20:07:44.844031 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" event={"ID":"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93","Type":"ContainerStarted","Data":"10e028a502196f18f400f587990845a8d3c4768fcce62f0553e3df6603163d68"} Jan 30 20:07:45 crc kubenswrapper[4782]: I0130 20:07:45.856815 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" event={"ID":"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93","Type":"ContainerStarted","Data":"326d471868d4b9fd74604eba71257fc90157f87c7ce5b10f6736b49bea4085a3"} Jan 30 20:07:45 crc kubenswrapper[4782]: I0130 20:07:45.879354 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" podStartSLOduration=1.8793283779999999 podStartE2EDuration="1.879328378s" podCreationTimestamp="2026-01-30 20:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 20:07:45.874103939 +0000 UTC m=+5842.142481974" watchObservedRunningTime="2026-01-30 20:07:45.879328378 +0000 UTC m=+5842.147706443" Jan 30 20:08:24 crc kubenswrapper[4782]: I0130 20:08:24.190586 4782 generic.go:334] "Generic (PLEG): container finished" podID="9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93" containerID="326d471868d4b9fd74604eba71257fc90157f87c7ce5b10f6736b49bea4085a3" exitCode=0 Jan 30 20:08:24 crc kubenswrapper[4782]: I0130 20:08:24.190722 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" event={"ID":"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93","Type":"ContainerDied","Data":"326d471868d4b9fd74604eba71257fc90157f87c7ce5b10f6736b49bea4085a3"} Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.355254 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.390019 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd5cc\" (UniqueName: \"kubernetes.io/projected/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-kube-api-access-bd5cc\") pod \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.390124 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-host\") pod \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\" (UID: \"9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93\") " Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.390453 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-host" (OuterVolumeSpecName: "host") pod "9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93" (UID: "9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.391021 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-5bgtz"] Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.391162 4782 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-host\") on node \"crc\" DevicePath \"\"" Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.395203 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-kube-api-access-bd5cc" (OuterVolumeSpecName: "kube-api-access-bd5cc") pod "9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93" (UID: "9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93"). InnerVolumeSpecName "kube-api-access-bd5cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.402661 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-5bgtz"] Jan 30 20:08:25 crc kubenswrapper[4782]: I0130 20:08:25.493694 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd5cc\" (UniqueName: \"kubernetes.io/projected/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93-kube-api-access-bd5cc\") on node \"crc\" DevicePath \"\"" Jan 30 20:08:26 crc kubenswrapper[4782]: I0130 20:08:26.214980 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e028a502196f18f400f587990845a8d3c4768fcce62f0553e3df6603163d68" Jan 30 20:08:26 crc kubenswrapper[4782]: I0130 20:08:26.215067 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-5bgtz" Jan 30 20:08:26 crc kubenswrapper[4782]: I0130 20:08:26.424030 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93" path="/var/lib/kubelet/pods/9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93/volumes" Jan 30 20:08:26 crc kubenswrapper[4782]: I0130 20:08:26.990099 4782 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qsfzf container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 20:08:26 crc kubenswrapper[4782]: I0130 20:08:26.990605 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-qsfzf" podUID="5d993e9b-840e-4235-9d1e-9d2cf1928afc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 20:08:26 crc kubenswrapper[4782]: I0130 20:08:26.993478 4782 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-s26rq" podUID="f8175d03-4ab5-4ed7-ab43-c722ef6a33b3" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.72:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.050768 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-xnhhx"] Jan 30 20:08:27 crc kubenswrapper[4782]: E0130 20:08:27.051259 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93" containerName="container-00" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.051284 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93" containerName="container-00" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.051576 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf97f47-1e3c-4d4b-a723-0e1bdeedfb93" containerName="container-00" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.056504 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.059138 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-w5bgg"/"default-dockercfg-xm77r" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.187538 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6q2\" (UniqueName: \"kubernetes.io/projected/ec648018-db8b-499d-b7e7-d755c4254fee-kube-api-access-ft6q2\") pod \"crc-debug-xnhhx\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.187658 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec648018-db8b-499d-b7e7-d755c4254fee-host\") pod \"crc-debug-xnhhx\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.289905 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6q2\" (UniqueName: \"kubernetes.io/projected/ec648018-db8b-499d-b7e7-d755c4254fee-kube-api-access-ft6q2\") pod \"crc-debug-xnhhx\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.290048 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec648018-db8b-499d-b7e7-d755c4254fee-host\") pod \"crc-debug-xnhhx\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.290275 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec648018-db8b-499d-b7e7-d755c4254fee-host\") pod \"crc-debug-xnhhx\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.319493 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6q2\" (UniqueName: \"kubernetes.io/projected/ec648018-db8b-499d-b7e7-d755c4254fee-kube-api-access-ft6q2\") pod \"crc-debug-xnhhx\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:27 crc kubenswrapper[4782]: I0130 20:08:27.374716 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:28 crc kubenswrapper[4782]: I0130 20:08:28.238949 4782 generic.go:334] "Generic (PLEG): container finished" podID="ec648018-db8b-499d-b7e7-d755c4254fee" containerID="6172c80be7d78138de838feea6af5a3710f62b0f558b53ea22defa907249b1a9" exitCode=0 Jan 30 20:08:28 crc kubenswrapper[4782]: I0130 20:08:28.239019 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" event={"ID":"ec648018-db8b-499d-b7e7-d755c4254fee","Type":"ContainerDied","Data":"6172c80be7d78138de838feea6af5a3710f62b0f558b53ea22defa907249b1a9"} Jan 30 20:08:28 crc kubenswrapper[4782]: I0130 20:08:28.239842 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" event={"ID":"ec648018-db8b-499d-b7e7-d755c4254fee","Type":"ContainerStarted","Data":"66e4b0e0a22b318b5c84fb97c62fe21f1dfae239bc13b050996ec26086e0d96e"} Jan 30 20:08:29 crc kubenswrapper[4782]: I0130 20:08:29.346409 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:29 crc kubenswrapper[4782]: I0130 20:08:29.531061 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec648018-db8b-499d-b7e7-d755c4254fee-host\") pod \"ec648018-db8b-499d-b7e7-d755c4254fee\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " Jan 30 20:08:29 crc kubenswrapper[4782]: I0130 20:08:29.531306 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec648018-db8b-499d-b7e7-d755c4254fee-host" (OuterVolumeSpecName: "host") pod "ec648018-db8b-499d-b7e7-d755c4254fee" (UID: "ec648018-db8b-499d-b7e7-d755c4254fee"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 20:08:29 crc kubenswrapper[4782]: I0130 20:08:29.531559 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft6q2\" (UniqueName: \"kubernetes.io/projected/ec648018-db8b-499d-b7e7-d755c4254fee-kube-api-access-ft6q2\") pod \"ec648018-db8b-499d-b7e7-d755c4254fee\" (UID: \"ec648018-db8b-499d-b7e7-d755c4254fee\") " Jan 30 20:08:29 crc kubenswrapper[4782]: I0130 20:08:29.532509 4782 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec648018-db8b-499d-b7e7-d755c4254fee-host\") on node \"crc\" DevicePath \"\"" Jan 30 20:08:29 crc kubenswrapper[4782]: I0130 20:08:29.548097 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec648018-db8b-499d-b7e7-d755c4254fee-kube-api-access-ft6q2" (OuterVolumeSpecName: "kube-api-access-ft6q2") pod "ec648018-db8b-499d-b7e7-d755c4254fee" (UID: "ec648018-db8b-499d-b7e7-d755c4254fee"). InnerVolumeSpecName "kube-api-access-ft6q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:08:29 crc kubenswrapper[4782]: I0130 20:08:29.633897 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft6q2\" (UniqueName: \"kubernetes.io/projected/ec648018-db8b-499d-b7e7-d755c4254fee-kube-api-access-ft6q2\") on node \"crc\" DevicePath \"\"" Jan 30 20:08:30 crc kubenswrapper[4782]: I0130 20:08:30.255042 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" event={"ID":"ec648018-db8b-499d-b7e7-d755c4254fee","Type":"ContainerDied","Data":"66e4b0e0a22b318b5c84fb97c62fe21f1dfae239bc13b050996ec26086e0d96e"} Jan 30 20:08:30 crc kubenswrapper[4782]: I0130 20:08:30.255086 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xnhhx" Jan 30 20:08:30 crc kubenswrapper[4782]: I0130 20:08:30.255088 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e4b0e0a22b318b5c84fb97c62fe21f1dfae239bc13b050996ec26086e0d96e" Jan 30 20:08:30 crc kubenswrapper[4782]: I0130 20:08:30.336854 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-xnhhx"] Jan 30 20:08:30 crc kubenswrapper[4782]: I0130 20:08:30.344423 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-xnhhx"] Jan 30 20:08:30 crc kubenswrapper[4782]: I0130 20:08:30.421074 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec648018-db8b-499d-b7e7-d755c4254fee" path="/var/lib/kubelet/pods/ec648018-db8b-499d-b7e7-d755c4254fee/volumes" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.585088 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-xfghj"] Jan 30 20:08:31 crc kubenswrapper[4782]: E0130 20:08:31.585729 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec648018-db8b-499d-b7e7-d755c4254fee" containerName="container-00" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.585740 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec648018-db8b-499d-b7e7-d755c4254fee" containerName="container-00" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.585953 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec648018-db8b-499d-b7e7-d755c4254fee" containerName="container-00" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.586633 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.589004 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-w5bgg"/"default-dockercfg-xm77r" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.672410 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96358eef-b170-4e74-b565-f3cdd9f46c75-host\") pod \"crc-debug-xfghj\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.672756 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdn6g\" (UniqueName: \"kubernetes.io/projected/96358eef-b170-4e74-b565-f3cdd9f46c75-kube-api-access-zdn6g\") pod \"crc-debug-xfghj\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.774092 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdn6g\" (UniqueName: \"kubernetes.io/projected/96358eef-b170-4e74-b565-f3cdd9f46c75-kube-api-access-zdn6g\") pod \"crc-debug-xfghj\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.774216 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96358eef-b170-4e74-b565-f3cdd9f46c75-host\") pod \"crc-debug-xfghj\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.774363 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96358eef-b170-4e74-b565-f3cdd9f46c75-host\") pod \"crc-debug-xfghj\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.791752 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdn6g\" (UniqueName: \"kubernetes.io/projected/96358eef-b170-4e74-b565-f3cdd9f46c75-kube-api-access-zdn6g\") pod \"crc-debug-xfghj\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:31 crc kubenswrapper[4782]: I0130 20:08:31.901544 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:32 crc kubenswrapper[4782]: I0130 20:08:32.271603 4782 generic.go:334] "Generic (PLEG): container finished" podID="96358eef-b170-4e74-b565-f3cdd9f46c75" containerID="91f1a381a1671cee05675b4a581254c12f4d3d486d605e984db317554306efd9" exitCode=0 Jan 30 20:08:32 crc kubenswrapper[4782]: I0130 20:08:32.271706 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-xfghj" event={"ID":"96358eef-b170-4e74-b565-f3cdd9f46c75","Type":"ContainerDied","Data":"91f1a381a1671cee05675b4a581254c12f4d3d486d605e984db317554306efd9"} Jan 30 20:08:32 crc kubenswrapper[4782]: I0130 20:08:32.271915 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/crc-debug-xfghj" event={"ID":"96358eef-b170-4e74-b565-f3cdd9f46c75","Type":"ContainerStarted","Data":"080c92032999ed0d130410834308698714052ad057bb8d0ee61d932e1de9b16a"} Jan 30 20:08:32 crc kubenswrapper[4782]: I0130 20:08:32.307350 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-xfghj"] Jan 30 20:08:32 crc kubenswrapper[4782]: I0130 20:08:32.318738 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5bgg/crc-debug-xfghj"] Jan 30 20:08:33 crc kubenswrapper[4782]: I0130 20:08:33.398771 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:33 crc kubenswrapper[4782]: I0130 20:08:33.507090 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdn6g\" (UniqueName: \"kubernetes.io/projected/96358eef-b170-4e74-b565-f3cdd9f46c75-kube-api-access-zdn6g\") pod \"96358eef-b170-4e74-b565-f3cdd9f46c75\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " Jan 30 20:08:33 crc kubenswrapper[4782]: I0130 20:08:33.507378 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96358eef-b170-4e74-b565-f3cdd9f46c75-host\") pod \"96358eef-b170-4e74-b565-f3cdd9f46c75\" (UID: \"96358eef-b170-4e74-b565-f3cdd9f46c75\") " Jan 30 20:08:33 crc kubenswrapper[4782]: I0130 20:08:33.507651 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96358eef-b170-4e74-b565-f3cdd9f46c75-host" (OuterVolumeSpecName: "host") pod "96358eef-b170-4e74-b565-f3cdd9f46c75" (UID: "96358eef-b170-4e74-b565-f3cdd9f46c75"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 20:08:33 crc kubenswrapper[4782]: I0130 20:08:33.508321 4782 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96358eef-b170-4e74-b565-f3cdd9f46c75-host\") on node \"crc\" DevicePath \"\"" Jan 30 20:08:33 crc kubenswrapper[4782]: I0130 20:08:33.523521 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96358eef-b170-4e74-b565-f3cdd9f46c75-kube-api-access-zdn6g" (OuterVolumeSpecName: "kube-api-access-zdn6g") pod "96358eef-b170-4e74-b565-f3cdd9f46c75" (UID: "96358eef-b170-4e74-b565-f3cdd9f46c75"). InnerVolumeSpecName "kube-api-access-zdn6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:08:33 crc kubenswrapper[4782]: I0130 20:08:33.621150 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdn6g\" (UniqueName: \"kubernetes.io/projected/96358eef-b170-4e74-b565-f3cdd9f46c75-kube-api-access-zdn6g\") on node \"crc\" DevicePath \"\"" Jan 30 20:08:34 crc kubenswrapper[4782]: I0130 20:08:34.289592 4782 scope.go:117] "RemoveContainer" containerID="91f1a381a1671cee05675b4a581254c12f4d3d486d605e984db317554306efd9" Jan 30 20:08:34 crc kubenswrapper[4782]: I0130 20:08:34.289907 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/crc-debug-xfghj" Jan 30 20:08:34 crc kubenswrapper[4782]: I0130 20:08:34.425169 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96358eef-b170-4e74-b565-f3cdd9f46c75" path="/var/lib/kubelet/pods/96358eef-b170-4e74-b565-f3cdd9f46c75/volumes" Jan 30 20:09:19 crc kubenswrapper[4782]: I0130 20:09:19.792680 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 20:09:19 crc kubenswrapper[4782]: I0130 20:09:19.793121 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 20:09:20 crc kubenswrapper[4782]: I0130 20:09:20.933618 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-564c766f5d-2hhs6_5162dd27-124a-4e1c-8a8c-51c4e47fce04/barbican-api/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.130762 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-564c766f5d-2hhs6_5162dd27-124a-4e1c-8a8c-51c4e47fce04/barbican-api-log/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.152857 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d6457597d-9bs7l_0575e76f-c529-41f7-8b65-87ec77ec9614/barbican-keystone-listener/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.203194 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-d6457597d-9bs7l_0575e76f-c529-41f7-8b65-87ec77ec9614/barbican-keystone-listener-log/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.322157 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d84b8b585-bfbrv_69354d0f-b465-419f-8fd1-b812a39312c5/barbican-worker/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.393840 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d84b8b585-bfbrv_69354d0f-b465-419f-8fd1-b812a39312c5/barbican-worker-log/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.558251 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bhrfj_f9e549bf-994f-46e6-9d42-72a655229b73/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.656631 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/ceilometer-central-agent/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.757834 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/ceilometer-notification-agent/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.801781 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/proxy-httpd/0.log" Jan 30 20:09:21 crc kubenswrapper[4782]: I0130 20:09:21.865112 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1666e348-ad78-40db-be34-e66ea72a6af8/sg-core/0.log" Jan 30 20:09:22 crc kubenswrapper[4782]: I0130 20:09:22.227400 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3d752ed-dc31-49b8-80ce-b3b94f07dcf3/cinder-api-log/0.log" Jan 30 20:09:22 crc kubenswrapper[4782]: I0130 20:09:22.529613 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2773f02f-26c1-4c26-a789-afc299bd11c1/probe/0.log" Jan 30 20:09:22 crc kubenswrapper[4782]: I0130 20:09:22.734979 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2773f02f-26c1-4c26-a789-afc299bd11c1/cinder-backup/0.log" Jan 30 20:09:22 crc kubenswrapper[4782]: I0130 20:09:22.826405 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3d752ed-dc31-49b8-80ce-b3b94f07dcf3/cinder-api/0.log" Jan 30 20:09:22 crc kubenswrapper[4782]: I0130 20:09:22.838440 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5ac7726e-05ca-4e51-99e2-cce317290a59/cinder-scheduler/0.log" Jan 30 20:09:22 crc kubenswrapper[4782]: I0130 20:09:22.936432 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5ac7726e-05ca-4e51-99e2-cce317290a59/probe/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.109549 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_eacda6b1-72d6-4a27-9aa5-c0b01309e9d9/probe/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.218168 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_eacda6b1-72d6-4a27-9aa5-c0b01309e9d9/cinder-volume/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.376260 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f0fe280a-4eaa-4dc5-8898-053826fd7131/probe/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.377324 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_f0fe280a-4eaa-4dc5-8898-053826fd7131/cinder-volume/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.457001 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-jm7jg_c0f6bd17-4c2b-43b5-a3eb-0cdbe230a566/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.619091 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hbxpk_f9396100-4e8e-4e30-af8c-82043b59d08d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.703725 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6dcf879fb5-dx4z8_a82aaec0-46a1-4f29-9c09-d4920bd1b315/init/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.912490 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6dcf879fb5-dx4z8_a82aaec0-46a1-4f29-9c09-d4920bd1b315/init/0.log" Jan 30 20:09:23 crc kubenswrapper[4782]: I0130 20:09:23.937480 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-58pjf_92905892-4424-4957-a945-eb130f92d03f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.046783 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6dcf879fb5-dx4z8_a82aaec0-46a1-4f29-9c09-d4920bd1b315/dnsmasq-dns/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.166735 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397/glance-httpd/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.199987 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fc9ea4f2-c7bd-4f7e-a42d-0ccc7d15c397/glance-log/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.348205 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_974e57d1-5346-4863-a1e3-1b595eaa91b5/glance-log/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.390178 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_974e57d1-5346-4863-a1e3-1b595eaa91b5/glance-httpd/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.618929 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d67b5c94d-pwj69_53e414f7-9297-46fb-87b6-19ce7ee55758/horizon/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.730660 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zkx58_a4f9b344-67a5-4f16-99b1-d8402f3e44cb/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:24 crc kubenswrapper[4782]: I0130 20:09:24.876890 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-z68rj_5b8169ab-3daf-43a7-a107-075317085df1/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:25 crc kubenswrapper[4782]: I0130 20:09:25.213785 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29496661-m5ftp_dfdd6010-0f66-4ba1-8a6e-b0b5fed2a6c2/keystone-cron/0.log" Jan 30 20:09:25 crc kubenswrapper[4782]: I0130 20:09:25.330419 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d67b5c94d-pwj69_53e414f7-9297-46fb-87b6-19ce7ee55758/horizon-log/0.log" Jan 30 20:09:25 crc kubenswrapper[4782]: I0130 20:09:25.460254 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29496721-qdvsh_87f9f5f2-88b7-4417-adac-110906eeceb5/keystone-cron/0.log" Jan 30 20:09:25 crc kubenswrapper[4782]: I0130 20:09:25.463427 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2faa1c8b-e69c-4b72-bc58-0d1a5e032d52/kube-state-metrics/0.log" Jan 30 20:09:25 crc kubenswrapper[4782]: I0130 20:09:25.654172 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-5c6f877b5f-8gdbg_199910fe-a283-4898-bd2b-69b6e1b7266b/keystone-api/0.log" Jan 30 20:09:25 crc kubenswrapper[4782]: I0130 20:09:25.804254 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-8cccm_31a7790e-b097-45c3-9088-5fc885e63ef8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:26 crc kubenswrapper[4782]: I0130 20:09:26.154396 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695d477669-wlmct_ed784e83-4524-4c3f-8697-ea3821f297b1/neutron-httpd/0.log" Jan 30 20:09:26 crc kubenswrapper[4782]: I0130 20:09:26.202762 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-695d477669-wlmct_ed784e83-4524-4c3f-8697-ea3821f297b1/neutron-api/0.log" Jan 30 20:09:26 crc kubenswrapper[4782]: I0130 20:09:26.300315 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lczlc_39dc3714-072a-4267-812c-49c2aa1efe2d/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:27 crc kubenswrapper[4782]: I0130 20:09:27.043265 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_860e5849-ad0b-4f89-87db-b839441f0dd9/nova-cell0-conductor-conductor/0.log" Jan 30 20:09:27 crc kubenswrapper[4782]: I0130 20:09:27.183829 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_71afc4ce-765f-4c71-a76e-6a4eff2b553d/nova-cell1-conductor-conductor/0.log" Jan 30 20:09:27 crc kubenswrapper[4782]: I0130 20:09:27.572318 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_858bbcbd-4a47-42ee-a581-2b03ca45dcaa/nova-cell1-novncproxy-novncproxy/0.log" Jan 30 20:09:27 crc kubenswrapper[4782]: I0130 20:09:27.719775 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_20743691-4aeb-4b01-a442-5df58c830c02/nova-api-log/0.log" Jan 30 20:09:27 crc kubenswrapper[4782]: I0130 20:09:27.784748 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-f8ss8_6e19180a-524d-4e70-8e9a-e72c69f07d7c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:28 crc kubenswrapper[4782]: I0130 20:09:28.041970 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_efd2ef42-aeac-48dd-9e95-fd000381dbfa/nova-metadata-log/0.log" Jan 30 20:09:28 crc kubenswrapper[4782]: I0130 20:09:28.314638 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_20743691-4aeb-4b01-a442-5df58c830c02/nova-api-api/0.log" Jan 30 20:09:28 crc kubenswrapper[4782]: I0130 20:09:28.530651 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a458f19f-501f-4703-9cfe-d8638418215b/mysql-bootstrap/0.log" Jan 30 20:09:28 crc kubenswrapper[4782]: I0130 20:09:28.665067 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a458f19f-501f-4703-9cfe-d8638418215b/mysql-bootstrap/0.log" Jan 30 20:09:28 crc kubenswrapper[4782]: I0130 20:09:28.669143 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1caadc78-c45b-4e64-ae44-a6f96bb41126/nova-scheduler-scheduler/0.log" Jan 30 20:09:28 crc kubenswrapper[4782]: I0130 20:09:28.701212 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_a458f19f-501f-4703-9cfe-d8638418215b/galera/0.log" Jan 30 20:09:28 crc kubenswrapper[4782]: I0130 20:09:28.946278 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_070c9056-8c32-47ae-b937-b3e4b2b464e7/mysql-bootstrap/0.log" Jan 30 20:09:29 crc kubenswrapper[4782]: I0130 20:09:29.122120 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_070c9056-8c32-47ae-b937-b3e4b2b464e7/mysql-bootstrap/0.log" Jan 30 20:09:29 crc kubenswrapper[4782]: I0130 20:09:29.183996 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_070c9056-8c32-47ae-b937-b3e4b2b464e7/galera/0.log" Jan 30 20:09:29 crc kubenswrapper[4782]: I0130 20:09:29.331041 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8322d742-28bf-4eb4-ba33-8e37da0780f1/openstackclient/0.log" Jan 30 20:09:29 crc kubenswrapper[4782]: I0130 20:09:29.407993 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hcgcc_ed7f80e9-b13c-461c-b115-55b8ce9662dc/openstack-network-exporter/0.log" Jan 30 20:09:29 crc kubenswrapper[4782]: I0130 20:09:29.623885 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovsdb-server-init/0.log" Jan 30 20:09:29 crc kubenswrapper[4782]: I0130 20:09:29.815812 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovsdb-server-init/0.log" Jan 30 20:09:29 crc kubenswrapper[4782]: I0130 20:09:29.842763 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovsdb-server/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.030036 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-pz2pk_91d457a1-1878-47f1-a1d3-eac450864978/ovn-controller/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.203733 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bk7c8_f3433e7d-6a6b-4f6b-b061-22479d5391f9/ovs-vswitchd/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.318642 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_efd2ef42-aeac-48dd-9e95-fd000381dbfa/nova-metadata-metadata/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.337799 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-qncfz_5e8ffe68-337e-40ee-a941-188e1bad9112/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.460838 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_547e7f64-963a-48bd-afa5-e908a3a716a2/openstack-network-exporter/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.555644 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_547e7f64-963a-48bd-afa5-e908a3a716a2/ovn-northd/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.625630 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a5e12229-4958-47a9-9210-18fba05c1319/openstack-network-exporter/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.644826 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_a5e12229-4958-47a9-9210-18fba05c1319/ovsdbserver-nb/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.852531 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6123a5a8-5a6d-455c-9418-71d31b35e2f3/ovsdbserver-sb/0.log" Jan 30 20:09:30 crc kubenswrapper[4782]: I0130 20:09:30.860006 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6123a5a8-5a6d-455c-9418-71d31b35e2f3/openstack-network-exporter/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.329862 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/init-config-reloader/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.441865 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6889757c94-v7jr9_da3c4b41-c384-4983-a704-e63d44f1fed9/placement-api/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.487788 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6889757c94-v7jr9_da3c4b41-c384-4983-a704-e63d44f1fed9/placement-log/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.506044 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/init-config-reloader/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.566963 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/config-reloader/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.629779 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/prometheus/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.681340 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_620ae2cd-1705-4975-92e7-32c6b559c37d/thanos-sidecar/0.log" Jan 30 20:09:31 crc kubenswrapper[4782]: I0130 20:09:31.824287 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_75a1a86d-bec9-47a8-9031-21a30029c09d/setup-container/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.021935 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_75a1a86d-bec9-47a8-9031-21a30029c09d/setup-container/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.081374 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9e3b2844-afde-444d-b7ee-cddd8b543bf6/setup-container/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.127901 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_75a1a86d-bec9-47a8-9031-21a30029c09d/rabbitmq/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.377086 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9e3b2844-afde-444d-b7ee-cddd8b543bf6/rabbitmq/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.393961 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f74ddec0-3f55-44e4-80f4-2d4eac7a9093/setup-container/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.415683 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-notifications-server-0_9e3b2844-afde-444d-b7ee-cddd8b543bf6/setup-container/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.607421 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f74ddec0-3f55-44e4-80f4-2d4eac7a9093/rabbitmq/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.638617 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f74ddec0-3f55-44e4-80f4-2d4eac7a9093/setup-container/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.723266 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nt2hh_385f85fe-f3e6-4149-9241-ae72c3e9d52d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.910872 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-42r9p_f81e56d7-1142-4ef0-b4eb-b95e2fb08a3f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:32 crc kubenswrapper[4782]: I0130 20:09:32.973910 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-f7w79_77e26ddb-4b47-4b06-a390-76653b75c503/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.119354 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7st4m_f287af23-a5f5-4aa9-b9c2-9cd87fc26da3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.164450 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mvw5c_0cceb6ea-381a-4862-bbff-42f7ce3cbaf4/ssh-known-hosts-edpm-deployment/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.460669 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6989f95847-z8k6r_d86a7921-fdce-4a73-ad98-4dc1373c72e2/proxy-server/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.635754 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mbdqv_8a9f5c0e-8d43-437d-b47e-e72f03df077b/swift-ring-rebalance/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.646047 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6989f95847-z8k6r_d86a7921-fdce-4a73-ad98-4dc1373c72e2/proxy-httpd/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.685682 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-auditor/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.853403 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-replicator/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.876440 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-reaper/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.941035 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/account-server/0.log" Jan 30 20:09:33 crc kubenswrapper[4782]: I0130 20:09:33.947557 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-auditor/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.090761 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-server/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.098963 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-replicator/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.155860 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/container-updater/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.191257 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-auditor/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.287056 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-expirer/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.330993 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-server/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.337248 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-replicator/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.425661 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/object-updater/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.537204 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/rsync/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.549824 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15ef1358-db2b-4935-b53c-7aad2613cee7/swift-recon-cron/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.653118 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-lrdbt_0055025f-d7c7-4469-9791-ffcb0bbdfef4/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.834007 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_bd740cf2-1846-4d1e-902e-6ba7a54c0019/tempest-tests-tempest-tests-runner/0.log" Jan 30 20:09:34 crc kubenswrapper[4782]: I0130 20:09:34.877923 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8b2f36d4-09f5-46f6-9e28-f26004cc80bf/test-operator-logs-container/0.log" Jan 30 20:09:35 crc kubenswrapper[4782]: I0130 20:09:35.092037 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_7c3e9bb9-ed43-4499-88c1-2bde956a84b8/memcached/0.log" Jan 30 20:09:35 crc kubenswrapper[4782]: I0130 20:09:35.263641 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-klkrq_df904ca8-14f8-4f01-b67e-be59a86d4981/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 20:09:35 crc kubenswrapper[4782]: 
I0130 20:09:35.883051 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_1e15b60d-e1ab-4144-a82d-021b51750157/watcher-applier/0.log" Jan 30 20:09:36 crc kubenswrapper[4782]: I0130 20:09:36.428384 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_68609276-cd5e-43d1-bef5-c79ef0628d5b/watcher-api-log/0.log" Jan 30 20:09:38 crc kubenswrapper[4782]: I0130 20:09:38.566062 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_25e52062-f76c-4ebf-9738-8e5a9990aba9/watcher-decision-engine/0.log" Jan 30 20:09:39 crc kubenswrapper[4782]: I0130 20:09:39.458817 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_68609276-cd5e-43d1-bef5-c79ef0628d5b/watcher-api/0.log" Jan 30 20:09:49 crc kubenswrapper[4782]: I0130 20:09:49.792363 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 20:09:49 crc kubenswrapper[4782]: I0130 20:09:49.792971 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 20:10:02 crc kubenswrapper[4782]: I0130 20:10:02.433221 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-fmn9l_c5ce61cb-fdd7-4a3d-8c2d-0c87afc86828/manager/0.log" Jan 30 20:10:02 crc kubenswrapper[4782]: I0130 20:10:02.589304 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-wmvtm_9517a543-a9e5-4253-a1b1-4154cf20a70a/manager/0.log" Jan 30 20:10:02 crc kubenswrapper[4782]: I0130 20:10:02.629881 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-txtbj_d82d84b6-3009-480d-b614-fbd420d90f0e/manager/0.log" Jan 30 20:10:02 crc kubenswrapper[4782]: I0130 20:10:02.732847 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/util/0.log" Jan 30 20:10:02 crc kubenswrapper[4782]: I0130 20:10:02.901185 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/util/0.log" Jan 30 20:10:02 crc kubenswrapper[4782]: I0130 20:10:02.905525 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/pull/0.log" Jan 30 20:10:02 crc kubenswrapper[4782]: I0130 20:10:02.941677 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/pull/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.100392 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/util/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.100496 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/pull/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.146370 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e904b862232e54b09cd7de48735f7ce66f2921e60e15d7c96ff487fdbcfnnxc_cda5a397-ca19-4c00-97b7-f92b445ddecb/extract/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.320145 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-v85zd_f55acdec-57ab-4e5d-97df-ac13e7b749da/manager/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.366771 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-gdx26_f03fb99f-3277-4bff-bcd2-93756326af54/manager/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.490979 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-745gl_cd676b0f-9e48-461d-8381-998645228b54/manager/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.743381 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-h4kqr_1cb2fc09-3cbc-4cee-8a31-04a050d8ff04/manager/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.874519 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-4jb22_2ca6290f-bb8e-484d-84bd-d9e66b9f1471/manager/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.912804 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-fg8bm_8b27955a-e2c6-43eb-953e-af3d66a687e3/manager/0.log" Jan 30 20:10:03 crc kubenswrapper[4782]: I0130 20:10:03.970307 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-ffbz8_8ec19937-0358-40cb-9fc0-de54ba844b62/manager/0.log" Jan 30 20:10:04 crc kubenswrapper[4782]: I0130 20:10:04.112880 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-94rpc_ccfec61d-1461-4d91-a834-3170c98cf92f/manager/0.log" Jan 30 20:10:04 crc kubenswrapper[4782]: I0130 20:10:04.218536 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-45lw5_79d06938-56c3-4ec4-a455-0fde260d8cdd/manager/0.log" Jan 30 20:10:04 crc kubenswrapper[4782]: I0130 20:10:04.403222 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-78nps_5a54baf5-b3a2-4417-8caf-8fe321ff5f5f/manager/0.log" Jan 30 20:10:04 crc kubenswrapper[4782]: I0130 20:10:04.445077 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-8phk5_0301eb58-f901-4952-9f7e-7764c0e67d7f/manager/0.log" Jan 30 20:10:04 crc kubenswrapper[4782]: I0130 20:10:04.566009 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4d524dn_acd35126-a27d-4b4c-b56b-04ebd8358c74/manager/0.log" Jan 30 20:10:04 crc kubenswrapper[4782]: I0130 20:10:04.746746 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-678fbb89d4-gxzc4_48f3d327-7068-48e5-bd16-e8983d7dce53/operator/0.log" Jan 30 20:10:04 crc kubenswrapper[4782]: I0130 20:10:04.940379 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-v7mq6_09760161-4b39-4185-9c1e-917ba1924171/registry-server/0.log" Jan 30 20:10:05 crc kubenswrapper[4782]: I0130 20:10:05.134570 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-lqrfz_097a5bf9-6be5-4d4e-9547-f1318371e9db/manager/0.log" Jan 30 20:10:05 crc kubenswrapper[4782]: I0130 20:10:05.270074 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-vsmt2_c212e215-248f-4b93-9a70-b352f425648c/manager/0.log" Jan 30 20:10:05 crc kubenswrapper[4782]: I0130 20:10:05.462155 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-8jcf8_fcdbdda2-62ba-4df8-9885-78c31d1e6157/operator/0.log" Jan 30 20:10:05 crc kubenswrapper[4782]: I0130 20:10:05.655815 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-rmkdk_52ac09b6-ec41-4ebc-ac18-018794fab085/manager/0.log" Jan 30 20:10:05 crc kubenswrapper[4782]: I0130 20:10:05.879737 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-khwrr_f0dd10e9-3f32-401a-a1d2-ba3e2ac503b3/manager/0.log" Jan 30 20:10:05 crc kubenswrapper[4782]: I0130 20:10:05.915792 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-brw4k_1794e6a9-01aa-43b7-841d-ca7bc24950f8/manager/0.log" Jan 30 20:10:06 crc kubenswrapper[4782]: I0130 20:10:06.018272 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-857dcb78d6-4vgqm_ffce2dc3-27a6-4caa-bcdf-3c9f5017c66c/manager/0.log" Jan 30 20:10:06 crc kubenswrapper[4782]: I0130 20:10:06.103295 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-78c8444fdd-928lz_a765979e-db86-4d07-8a0a-96c61d42137c/manager/0.log" Jan 30 20:10:19 crc kubenswrapper[4782]: I0130 20:10:19.793067 4782 patch_prober.go:28] interesting pod/machine-config-daemon-p7zdh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 20:10:19 crc kubenswrapper[4782]: I0130 20:10:19.793578 4782 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 20:10:19 crc kubenswrapper[4782]: I0130 20:10:19.793624 4782 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" Jan 30 20:10:19 crc kubenswrapper[4782]: I0130 20:10:19.794423 4782 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04"} pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 20:10:19 crc kubenswrapper[4782]: I0130 20:10:19.794475 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerName="machine-config-daemon" containerID="cri-o://fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" gracePeriod=600 Jan 30 20:10:19 crc kubenswrapper[4782]: E0130 20:10:19.925681 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:10:20 crc kubenswrapper[4782]: I0130 20:10:20.270980 4782 generic.go:334] "Generic (PLEG): container finished" podID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" exitCode=0 Jan 30 20:10:20 crc kubenswrapper[4782]: I0130 20:10:20.271047 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerDied","Data":"fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04"} Jan 30 20:10:20 crc kubenswrapper[4782]: I0130 20:10:20.271130 4782 scope.go:117] "RemoveContainer" containerID="e735abd3f92a6ae6e189a47ed0c4a0e4ad261bbd47b37e9b91f587b3a01cd393" Jan 30 20:10:20 crc kubenswrapper[4782]: I0130 20:10:20.271883 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:10:20 crc kubenswrapper[4782]: E0130 20:10:20.272142 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:10:25 crc kubenswrapper[4782]: I0130 20:10:25.609727 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-df42d_dd0f947d-ef9a-43ea-a5a0-7fe20d429739/control-plane-machine-set-operator/0.log" Jan 30 20:10:25 crc kubenswrapper[4782]: I0130 20:10:25.769317 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kr9hs_3b9097e3-f69b-49ae-9781-52921de78625/kube-rbac-proxy/0.log" Jan 30 20:10:25 crc kubenswrapper[4782]: I0130 20:10:25.817842 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kr9hs_3b9097e3-f69b-49ae-9781-52921de78625/machine-api-operator/0.log" Jan 30 20:10:32 crc kubenswrapper[4782]: I0130 20:10:32.411794 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:10:32 crc kubenswrapper[4782]: E0130 20:10:32.412748 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:10:40 crc kubenswrapper[4782]: I0130 20:10:40.193981 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-wspn9_afb307de-3731-434f-bbf7-3f8fcd8cd336/cert-manager-controller/0.log" Jan 30 20:10:40 crc kubenswrapper[4782]: I0130 20:10:40.307210 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8hdgf_bd00add1-aab0-4229-837f-7f79d71ad160/cert-manager-cainjector/0.log" Jan 30 20:10:40 crc kubenswrapper[4782]: I0130 20:10:40.443525 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s26rq_f8175d03-4ab5-4ed7-ab43-c722ef6a33b3/cert-manager-webhook/0.log" Jan 30 20:10:47 crc kubenswrapper[4782]: I0130 20:10:47.411978 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:10:47 crc kubenswrapper[4782]: E0130 20:10:47.412861 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:10:53 crc kubenswrapper[4782]: I0130 20:10:53.925449 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-z6fgp_42d8b05a-8142-462f-b3ad-e496c30e8eea/nmstate-console-plugin/0.log" Jan 30 20:10:54 crc kubenswrapper[4782]: I0130 20:10:54.081151 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2blvc_61543235-f4f6-4320-b2ef-11521d91d360/nmstate-handler/0.log" Jan 30 20:10:54 crc kubenswrapper[4782]: I0130 20:10:54.125820 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5bs9t_8bab1b5d-f025-4df0-ba3c-d406621dd5ac/kube-rbac-proxy/0.log" Jan 30 20:10:54 crc kubenswrapper[4782]: I0130 20:10:54.197731 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-5bs9t_8bab1b5d-f025-4df0-ba3c-d406621dd5ac/nmstate-metrics/0.log" Jan 30 20:10:54 crc kubenswrapper[4782]: I0130 20:10:54.303957 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-prp59_37a25f92-459c-447c-846b-bfd73a950907/nmstate-operator/0.log" Jan 30 20:10:54 crc kubenswrapper[4782]: I0130 20:10:54.364654 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-hsdpj_7022f3b6-d4c1-4b83-b541-2125a53e701c/nmstate-webhook/0.log" Jan 30 20:10:58 crc kubenswrapper[4782]: I0130 20:10:58.411256 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:10:58 crc kubenswrapper[4782]: E0130 20:10:58.412883 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.370021 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rbj5f"] Jan 30 20:11:08 crc kubenswrapper[4782]: E0130 20:11:08.370896 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96358eef-b170-4e74-b565-f3cdd9f46c75" containerName="container-00" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.370908 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="96358eef-b170-4e74-b565-f3cdd9f46c75" containerName="container-00" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.371129 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="96358eef-b170-4e74-b565-f3cdd9f46c75" containerName="container-00" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.372543 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.383096 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbj5f"] Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.437928 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrrp\" (UniqueName: \"kubernetes.io/projected/671ff40b-ac59-4b34-a865-137ab4a9e0bc-kube-api-access-vbrrp\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.437984 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-catalog-content\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.438026 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-utilities\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.540146 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrrp\" (UniqueName: \"kubernetes.io/projected/671ff40b-ac59-4b34-a865-137ab4a9e0bc-kube-api-access-vbrrp\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " 
pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.540195 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-catalog-content\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.540239 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-utilities\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.540755 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-catalog-content\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.540844 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-utilities\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.568345 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrrp\" (UniqueName: \"kubernetes.io/projected/671ff40b-ac59-4b34-a865-137ab4a9e0bc-kube-api-access-vbrrp\") pod \"redhat-marketplace-rbj5f\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.690773 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:08 crc kubenswrapper[4782]: I0130 20:11:08.988074 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kmgbq_92e82803-8b7d-46f3-ba40-2900590261cf/prometheus-operator/0.log" Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.181165 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbj5f"] Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.312637 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp_05098fbb-e910-4fec-8a31-fd98d476b941/prometheus-operator-admission-webhook/0.log" Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.370061 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22_19b18d8a-aa0f-494e-9e56-55bceba788c6/prometheus-operator-admission-webhook/0.log" Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.410849 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:11:09 crc kubenswrapper[4782]: E0130 20:11:09.411394 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.513143 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pcglc_786ed08c-6b06-4e44-aaf4-5562ef433b88/operator/0.log" Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.578973 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w89kr_616c3ea8-075a-475f-9896-180a02e4cc3f/perses-operator/0.log" Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.839588 4782 generic.go:334] "Generic (PLEG): container finished" podID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerID="9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643" exitCode=0 Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.839634 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbj5f" event={"ID":"671ff40b-ac59-4b34-a865-137ab4a9e0bc","Type":"ContainerDied","Data":"9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643"} Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.839661 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbj5f" event={"ID":"671ff40b-ac59-4b34-a865-137ab4a9e0bc","Type":"ContainerStarted","Data":"c723a29c911ce487f5a34902217ac86a22eb2392c9326593f86d47f0315ca222"} Jan 30 20:11:09 crc kubenswrapper[4782]: I0130 20:11:09.841856 4782 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 20:11:10 crc kubenswrapper[4782]: I0130 20:11:10.847997 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbj5f" 
event={"ID":"671ff40b-ac59-4b34-a865-137ab4a9e0bc","Type":"ContainerStarted","Data":"311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034"} Jan 30 20:11:11 crc kubenswrapper[4782]: I0130 20:11:11.856665 4782 generic.go:334] "Generic (PLEG): container finished" podID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerID="311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034" exitCode=0 Jan 30 20:11:11 crc kubenswrapper[4782]: I0130 20:11:11.856730 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbj5f" event={"ID":"671ff40b-ac59-4b34-a865-137ab4a9e0bc","Type":"ContainerDied","Data":"311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034"} Jan 30 20:11:12 crc kubenswrapper[4782]: I0130 20:11:12.868754 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbj5f" event={"ID":"671ff40b-ac59-4b34-a865-137ab4a9e0bc","Type":"ContainerStarted","Data":"d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95"} Jan 30 20:11:12 crc kubenswrapper[4782]: I0130 20:11:12.897687 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rbj5f" podStartSLOduration=2.42197162 podStartE2EDuration="4.897669827s" podCreationTimestamp="2026-01-30 20:11:08 +0000 UTC" firstStartedPulling="2026-01-30 20:11:09.841565399 +0000 UTC m=+6046.109943424" lastFinishedPulling="2026-01-30 20:11:12.317263606 +0000 UTC m=+6048.585641631" observedRunningTime="2026-01-30 20:11:12.892274084 +0000 UTC m=+6049.160652119" watchObservedRunningTime="2026-01-30 20:11:12.897669827 +0000 UTC m=+6049.166047862" Jan 30 20:11:18 crc kubenswrapper[4782]: I0130 20:11:18.691565 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:18 crc kubenswrapper[4782]: I0130 20:11:18.692447 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:18 crc kubenswrapper[4782]: I0130 20:11:18.763862 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:19 crc kubenswrapper[4782]: I0130 20:11:19.008773 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:19 crc kubenswrapper[4782]: I0130 20:11:19.066452 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbj5f"] Jan 30 20:11:20 crc kubenswrapper[4782]: I0130 20:11:20.952300 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rbj5f" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="registry-server" containerID="cri-o://d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95" gracePeriod=2 Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.482750 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.615051 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-utilities\") pod \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.615195 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbrrp\" (UniqueName: \"kubernetes.io/projected/671ff40b-ac59-4b34-a865-137ab4a9e0bc-kube-api-access-vbrrp\") pod \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.615338 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-catalog-content\") pod \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\" (UID: \"671ff40b-ac59-4b34-a865-137ab4a9e0bc\") " Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.615851 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-utilities" (OuterVolumeSpecName: "utilities") pod "671ff40b-ac59-4b34-a865-137ab4a9e0bc" (UID: "671ff40b-ac59-4b34-a865-137ab4a9e0bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.616376 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.624475 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671ff40b-ac59-4b34-a865-137ab4a9e0bc-kube-api-access-vbrrp" (OuterVolumeSpecName: "kube-api-access-vbrrp") pod "671ff40b-ac59-4b34-a865-137ab4a9e0bc" (UID: "671ff40b-ac59-4b34-a865-137ab4a9e0bc"). InnerVolumeSpecName "kube-api-access-vbrrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.720942 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbrrp\" (UniqueName: \"kubernetes.io/projected/671ff40b-ac59-4b34-a865-137ab4a9e0bc-kube-api-access-vbrrp\") on node \"crc\" DevicePath \"\"" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.898249 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "671ff40b-ac59-4b34-a865-137ab4a9e0bc" (UID: "671ff40b-ac59-4b34-a865-137ab4a9e0bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.924146 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/671ff40b-ac59-4b34-a865-137ab4a9e0bc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.965857 4782 generic.go:334] "Generic (PLEG): container finished" podID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerID="d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95" exitCode=0 Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.965898 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbj5f" event={"ID":"671ff40b-ac59-4b34-a865-137ab4a9e0bc","Type":"ContainerDied","Data":"d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95"} Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.965943 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rbj5f" event={"ID":"671ff40b-ac59-4b34-a865-137ab4a9e0bc","Type":"ContainerDied","Data":"c723a29c911ce487f5a34902217ac86a22eb2392c9326593f86d47f0315ca222"} Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.965946 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rbj5f" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.965965 4782 scope.go:117] "RemoveContainer" containerID="d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95" Jan 30 20:11:21 crc kubenswrapper[4782]: I0130 20:11:21.991362 4782 scope.go:117] "RemoveContainer" containerID="311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034" Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.017016 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbj5f"] Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.031777 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rbj5f"] Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.032836 4782 scope.go:117] "RemoveContainer" containerID="9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643" Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.082627 4782 scope.go:117] "RemoveContainer" containerID="d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95" Jan 30 20:11:22 crc kubenswrapper[4782]: E0130 20:11:22.083081 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95\": container with ID starting with d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95 not found: ID does not exist" containerID="d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95" Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.083134 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95"} err="failed to get container status \"d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95\": rpc error: code = NotFound desc = could not find container \"d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95\": container with ID starting with d22f29d9c46ad61d9965be6cc437f6cd1a1f6175ab4a69c89d4c192ca1ee7e95 not found: ID does not exist" Jan 30 20:11:22 
crc kubenswrapper[4782]: I0130 20:11:22.083168 4782 scope.go:117] "RemoveContainer" containerID="311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034" Jan 30 20:11:22 crc kubenswrapper[4782]: E0130 20:11:22.083622 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034\": container with ID starting with 311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034 not found: ID does not exist" containerID="311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034" Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.083662 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034"} err="failed to get container status \"311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034\": rpc error: code = NotFound desc = could not find container \"311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034\": container with ID starting with 311efc9a1330f2cdd223b3edea42f438b7c5e1a0ac52557e1d33b42c5edce034 not found: ID does not exist" Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.083688 4782 scope.go:117] "RemoveContainer" containerID="9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643" Jan 30 20:11:22 crc kubenswrapper[4782]: E0130 20:11:22.083968 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643\": container with ID starting with 9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643 not found: ID does not exist" containerID="9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643" Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.083997 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643"} err="failed to get container status \"9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643\": rpc error: code = NotFound desc = could not find container \"9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643\": container with ID starting with 9f1d943d89e764cca6146250fbbcc0a2429ccf6af8bcf6d36f8c6fd9c1d21643 not found: ID does not exist" Jan 30 20:11:22 crc kubenswrapper[4782]: I0130 20:11:22.454706 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" path="/var/lib/kubelet/pods/671ff40b-ac59-4b34-a865-137ab4a9e0bc/volumes" Jan 30 20:11:23 crc kubenswrapper[4782]: I0130 20:11:23.410989 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:11:23 crc kubenswrapper[4782]: E0130 20:11:23.411561 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:11:24 crc kubenswrapper[4782]: I0130 20:11:24.718656 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-6968d8fdc4-nlj8p_b45e2233-e51f-4f71-bc45-cd73fa8302de/kube-rbac-proxy/0.log" Jan 30 20:11:24 crc kubenswrapper[4782]: I0130 20:11:24.851654 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-nlj8p_b45e2233-e51f-4f71-bc45-cd73fa8302de/controller/0.log" Jan 30 20:11:25 crc kubenswrapper[4782]: I0130 20:11:25.044039 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:11:25 crc kubenswrapper[4782]: I0130 20:11:25.263897 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:11:25 crc kubenswrapper[4782]: I0130 20:11:25.402763 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:11:25 crc kubenswrapper[4782]: I0130 20:11:25.599301 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-m7rqg_67ea755b-acbd-4894-9070-356cb15f18d3/frr-k8s-webhook-server/0.log" Jan 30 20:11:25 crc kubenswrapper[4782]: I0130 20:11:25.655252 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:11:25 crc kubenswrapper[4782]: I0130 20:11:25.784971 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:11:25 crc kubenswrapper[4782]: I0130 20:11:25.966056 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.038085 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.051111 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.067599 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.235931 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-reloader/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.240857 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-frr-files/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.245081 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/controller/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.245989 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/cp-metrics/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.464375 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/frr-metrics/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.499115 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/kube-rbac-proxy-frr/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.516852 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/kube-rbac-proxy/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.704783 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/reloader/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.730744 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-597897467b-d7mjb_a536d77e-78b4-4ec2-a0d2-80e853e186fb/manager/0.log" Jan 30 20:11:26 crc kubenswrapper[4782]: I0130 20:11:26.893018 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-bccd97cd9-rmxjs_c696687d-14f1-4f3b-b9ee-36e3845aa7c2/webhook-server/0.log" Jan 30 20:11:27 crc kubenswrapper[4782]: I0130 20:11:27.085327 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jqzpm_824018fe-7708-4c75-aaac-19bfb9f22405/kube-rbac-proxy/0.log" Jan 30 20:11:27 crc kubenswrapper[4782]: I0130 20:11:27.606672 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jqzpm_824018fe-7708-4c75-aaac-19bfb9f22405/speaker/0.log" Jan 30 20:11:27 crc kubenswrapper[4782]: I0130 20:11:27.886356 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wt4wf_1da61d3b-efb6-453e-8e4b-ca98c629c39a/frr/0.log" Jan 30 20:11:34 crc kubenswrapper[4782]: I0130 20:11:34.420833 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:11:34 crc kubenswrapper[4782]: E0130 20:11:34.421615 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:11:37 crc kubenswrapper[4782]: I0130 20:11:37.231008 4782 scope.go:117] "RemoveContainer" containerID="fc6896929383b3309d8116f2b21f3f16748ffbe68ada600974acdf67e23683d5" Jan 30 20:11:37 crc kubenswrapper[4782]: I0130 20:11:37.251410 4782 scope.go:117] "RemoveContainer" containerID="32007611f1facec0316bcded60962382542ed69aa85eaec5b977b7b8da960e76" Jan 30 20:11:37 crc kubenswrapper[4782]: I0130 20:11:37.273338 4782 scope.go:117] "RemoveContainer" containerID="c4ca8a0053d28cd95181d22f4cc4415c1d11ef64b7de41dbd9b1c3cb55609511" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.332271 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/util/0.log" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.501711 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/util/0.log" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.538450 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/pull/0.log" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.580436 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/pull/0.log" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.726021 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/util/0.log" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.764315 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/pull/0.log" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.768174 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcc8d6m_7e835e31-1014-43e6-8bb6-34ad5fa00bab/extract/0.log" Jan 30 20:11:42 crc kubenswrapper[4782]: I0130 20:11:42.906419 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/util/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.095492 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/pull/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.137380 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/util/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.139970 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/pull/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.305818 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/util/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.337661 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/extract/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.351214 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713br5nn_104efc3a-1dff-4e45-8448-ea03ec78e23f/pull/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.487991 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/util/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.710186 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/util/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.770929 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/pull/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.775284 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/pull/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.902599 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/util/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.920483 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/pull/0.log" Jan 30 20:11:43 crc kubenswrapper[4782]: I0130 20:11:43.956027 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08z9jnm_daebabea-1b8e-4b77-9ea1-7cd8c7270caf/extract/0.log" Jan 30 20:11:44 crc kubenswrapper[4782]: I0130 20:11:44.080465 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-utilities/0.log" Jan 30 20:11:44 crc kubenswrapper[4782]: I0130 20:11:44.254819 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-content/0.log" Jan 30 20:11:44 crc kubenswrapper[4782]: I0130 20:11:44.268655 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-content/0.log" Jan 30 20:11:44 crc kubenswrapper[4782]: I0130 20:11:44.279364 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-utilities/0.log" Jan 30 20:11:44 crc kubenswrapper[4782]: I0130 20:11:44.709894 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-utilities/0.log" Jan 30 20:11:44 crc kubenswrapper[4782]: I0130 20:11:44.723568 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/extract-content/0.log" Jan 30 20:11:44 crc kubenswrapper[4782]: I0130 20:11:44.973188 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-utilities/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.141859 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-utilities/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.206304 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-content/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.234312 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-content/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.270580 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jvlds_858d4185-257a-4486-8147-63381dd9a8f6/registry-server/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.350344 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-utilities/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.389025 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/extract-content/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.567830 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qsfzf_5d993e9b-840e-4235-9d1e-9d2cf1928afc/marketplace-operator/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.729112 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-utilities/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.941458 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-content/0.log" Jan 30 20:11:45 crc kubenswrapper[4782]: I0130 20:11:45.974644 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-utilities/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.004420 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-content/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.184442 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jklvb_8a2c42a5-1f55-4d42-8696-d59384fa426f/registry-server/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.210281 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-content/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.237579 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/extract-utilities/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.404519 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-utilities/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.412794 4782 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-t6s5g_b8269608-e848-472d-a953-8d8b3a3418e2/registry-server/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.578588 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-content/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.579303 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-utilities/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.585183 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-content/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.676676 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzh49"] Jan 30 20:11:46 crc kubenswrapper[4782]: E0130 20:11:46.677888 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="extract-content" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.677923 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="extract-content" Jan 30 20:11:46 crc kubenswrapper[4782]: E0130 20:11:46.677960 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="extract-utilities" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.677969 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="extract-utilities" Jan 30 20:11:46 crc kubenswrapper[4782]: E0130 20:11:46.677995 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="registry-server" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.678002 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="registry-server" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.678175 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="671ff40b-ac59-4b34-a865-137ab4a9e0bc" containerName="registry-server" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.679670 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.686029 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6b5\" (UniqueName: \"kubernetes.io/projected/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-kube-api-access-ft6b5\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.686103 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-catalog-content\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.686187 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-utilities\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.691049 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzh49"] Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.764035 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-content/0.log" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.787332 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-utilities\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.787460 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6b5\" (UniqueName: \"kubernetes.io/projected/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-kube-api-access-ft6b5\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.787527 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-catalog-content\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.787935 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-utilities\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.787973 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-catalog-content\") pod 
\"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.805931 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6b5\" (UniqueName: \"kubernetes.io/projected/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-kube-api-access-ft6b5\") pod \"certified-operators-vzh49\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:46 crc kubenswrapper[4782]: I0130 20:11:46.842849 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/extract-utilities/0.log" Jan 30 20:11:47 crc kubenswrapper[4782]: I0130 20:11:47.024635 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:47 crc kubenswrapper[4782]: I0130 20:11:47.483448 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f7ztk_a6093f28-21a4-43ed-873f-4be71c22abfe/registry-server/0.log" Jan 30 20:11:47 crc kubenswrapper[4782]: I0130 20:11:47.606131 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzh49"] Jan 30 20:11:48 crc kubenswrapper[4782]: I0130 20:11:48.413957 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:11:48 crc kubenswrapper[4782]: E0130 20:11:48.414634 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:11:48 crc kubenswrapper[4782]: I0130 20:11:48.440546 4782 generic.go:334] "Generic (PLEG): container finished" podID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerID="060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac" exitCode=0 Jan 30 20:11:48 crc kubenswrapper[4782]: I0130 20:11:48.440587 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzh49" event={"ID":"ee2b68c1-8e49-4716-9360-0152a9d5d9d2","Type":"ContainerDied","Data":"060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac"} Jan 30 20:11:48 crc kubenswrapper[4782]: I0130 20:11:48.440614 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzh49" event={"ID":"ee2b68c1-8e49-4716-9360-0152a9d5d9d2","Type":"ContainerStarted","Data":"a55e05450596516b697039930d3f44cac292de6bf115b9f741d83d5c18c4f1ba"} Jan 30 20:11:49 crc kubenswrapper[4782]: I0130 20:11:49.450175 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzh49" event={"ID":"ee2b68c1-8e49-4716-9360-0152a9d5d9d2","Type":"ContainerStarted","Data":"4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc"} Jan 30 20:11:50 crc kubenswrapper[4782]: I0130 20:11:50.470161 4782 generic.go:334] "Generic (PLEG): container finished" podID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerID="4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc" exitCode=0 Jan 30 20:11:50 crc 
kubenswrapper[4782]: I0130 20:11:50.470248 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzh49" event={"ID":"ee2b68c1-8e49-4716-9360-0152a9d5d9d2","Type":"ContainerDied","Data":"4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc"} Jan 30 20:11:51 crc kubenswrapper[4782]: I0130 20:11:51.482094 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzh49" event={"ID":"ee2b68c1-8e49-4716-9360-0152a9d5d9d2","Type":"ContainerStarted","Data":"6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d"} Jan 30 20:11:51 crc kubenswrapper[4782]: I0130 20:11:51.512370 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzh49" podStartSLOduration=2.962490926 podStartE2EDuration="5.512341474s" podCreationTimestamp="2026-01-30 20:11:46 +0000 UTC" firstStartedPulling="2026-01-30 20:11:48.443112832 +0000 UTC m=+6084.711490857" lastFinishedPulling="2026-01-30 20:11:50.99296338 +0000 UTC m=+6087.261341405" observedRunningTime="2026-01-30 20:11:51.500450681 +0000 UTC m=+6087.768828866" watchObservedRunningTime="2026-01-30 20:11:51.512341474 +0000 UTC m=+6087.780719509" Jan 30 20:11:57 crc kubenswrapper[4782]: I0130 20:11:57.026155 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:57 crc kubenswrapper[4782]: I0130 20:11:57.026787 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:57 crc kubenswrapper[4782]: I0130 20:11:57.081409 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:57 crc kubenswrapper[4782]: I0130 20:11:57.598543 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:11:57 crc kubenswrapper[4782]: I0130 20:11:57.654657 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzh49"] Jan 30 20:11:59 crc kubenswrapper[4782]: I0130 20:11:59.564360 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzh49" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerName="registry-server" containerID="cri-o://6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d" gracePeriod=2 Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.141251 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.238993 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft6b5\" (UniqueName: \"kubernetes.io/projected/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-kube-api-access-ft6b5\") pod \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.239497 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-catalog-content\") pod \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.239554 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-utilities\") pod \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\" (UID: \"ee2b68c1-8e49-4716-9360-0152a9d5d9d2\") " Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.240991 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-utilities" (OuterVolumeSpecName: "utilities") pod "ee2b68c1-8e49-4716-9360-0152a9d5d9d2" (UID: "ee2b68c1-8e49-4716-9360-0152a9d5d9d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.252867 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-kube-api-access-ft6b5" (OuterVolumeSpecName: "kube-api-access-ft6b5") pod "ee2b68c1-8e49-4716-9360-0152a9d5d9d2" (UID: "ee2b68c1-8e49-4716-9360-0152a9d5d9d2"). InnerVolumeSpecName "kube-api-access-ft6b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.299006 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee2b68c1-8e49-4716-9360-0152a9d5d9d2" (UID: "ee2b68c1-8e49-4716-9360-0152a9d5d9d2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.342806 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft6b5\" (UniqueName: \"kubernetes.io/projected/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-kube-api-access-ft6b5\") on node \"crc\" DevicePath \"\"" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.342850 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.342863 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2b68c1-8e49-4716-9360-0152a9d5d9d2-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.419442 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:12:00 crc kubenswrapper[4782]: E0130 20:12:00.419802 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.581184 4782 generic.go:334] "Generic (PLEG): container finished" podID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerID="6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d" exitCode=0 Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.581239 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzh49" event={"ID":"ee2b68c1-8e49-4716-9360-0152a9d5d9d2","Type":"ContainerDied","Data":"6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d"} Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.581267 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzh49" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.581283 4782 scope.go:117] "RemoveContainer" containerID="6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.581273 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzh49" event={"ID":"ee2b68c1-8e49-4716-9360-0152a9d5d9d2","Type":"ContainerDied","Data":"a55e05450596516b697039930d3f44cac292de6bf115b9f741d83d5c18c4f1ba"} Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.612795 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzh49"] Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.612939 4782 scope.go:117] "RemoveContainer" containerID="4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.638566 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzh49"] Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.653397 4782 scope.go:117] "RemoveContainer" containerID="060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.678129 4782 scope.go:117] "RemoveContainer" containerID="6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d" Jan 30 20:12:00 crc kubenswrapper[4782]: E0130 20:12:00.678532 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d\": container with ID starting with 6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d not found: ID does not exist" containerID="6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.678566 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d"} err="failed to get container status \"6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d\": rpc error: code = NotFound desc = could not find container \"6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d\": container with ID starting with 6266b7dd2c7fe314f006bce58e9c1138471b811a90ac140e2bc6e86dfb01e08d not found: ID does not exist" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.678586 4782 scope.go:117] "RemoveContainer" containerID="4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc" Jan 30 20:12:00 crc kubenswrapper[4782]: E0130 20:12:00.678892 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc\": container with ID starting with 4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc not found: ID does not exist" containerID="4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.678916 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc"} err="failed to get container status \"4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc\": rpc error: code = NotFound desc = could not find 
container \"4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc\": container with ID starting with 4fdeb6128a5ec6c81b97d49763e1379ca853d1bf98120b42f22a3d8d9d50f2bc not found: ID does not exist" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.678930 4782 scope.go:117] "RemoveContainer" containerID="060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac" Jan 30 20:12:00 crc kubenswrapper[4782]: E0130 20:12:00.679273 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac\": container with ID starting with 060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac not found: ID does not exist" containerID="060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac" Jan 30 20:12:00 crc kubenswrapper[4782]: I0130 20:12:00.679327 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac"} err="failed to get container status \"060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac\": rpc error: code = NotFound desc = could not find container \"060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac\": container with ID starting with 060884a33fa2ac566ba47269d404e791118c3bedca1d82d4c9a7e834cfdb81ac not found: ID does not exist" Jan 30 20:12:02 crc kubenswrapper[4782]: I0130 20:12:02.283087 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-6tvwp_05098fbb-e910-4fec-8a31-fd98d476b941/prometheus-operator-admission-webhook/0.log" Jan 30 20:12:02 crc kubenswrapper[4782]: I0130 20:12:02.302683 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kmgbq_92e82803-8b7d-46f3-ba40-2900590261cf/prometheus-operator/0.log" Jan 30 20:12:02 crc kubenswrapper[4782]: I0130 20:12:02.389033 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-dd46f96f9-f2l22_19b18d8a-aa0f-494e-9e56-55bceba788c6/prometheus-operator-admission-webhook/0.log" Jan 30 20:12:02 crc kubenswrapper[4782]: I0130 20:12:02.421823 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" path="/var/lib/kubelet/pods/ee2b68c1-8e49-4716-9360-0152a9d5d9d2/volumes" Jan 30 20:12:02 crc kubenswrapper[4782]: I0130 20:12:02.486356 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-pcglc_786ed08c-6b06-4e44-aaf4-5562ef433b88/operator/0.log" Jan 30 20:12:02 crc kubenswrapper[4782]: I0130 20:12:02.553068 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-w89kr_616c3ea8-075a-475f-9896-180a02e4cc3f/perses-operator/0.log" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.150150 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kcdcn"] Jan 30 20:12:08 crc kubenswrapper[4782]: E0130 20:12:08.151116 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerName="registry-server" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.151129 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" 
containerName="registry-server" Jan 30 20:12:08 crc kubenswrapper[4782]: E0130 20:12:08.151146 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerName="extract-utilities" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.151152 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerName="extract-utilities" Jan 30 20:12:08 crc kubenswrapper[4782]: E0130 20:12:08.151169 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerName="extract-content" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.151177 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerName="extract-content" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.151393 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2b68c1-8e49-4716-9360-0152a9d5d9d2" containerName="registry-server" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.154030 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.178859 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcdcn"] Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.233407 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-utilities\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.233445 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-catalog-content\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.233526 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfrc6\" (UniqueName: \"kubernetes.io/projected/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-kube-api-access-tfrc6\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.336522 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfrc6\" (UniqueName: \"kubernetes.io/projected/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-kube-api-access-tfrc6\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.336667 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-utilities\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.336684 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-catalog-content\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.337193 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-catalog-content\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.337755 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-utilities\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.383163 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfrc6\" (UniqueName: \"kubernetes.io/projected/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-kube-api-access-tfrc6\") pod \"redhat-operators-kcdcn\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:08 crc kubenswrapper[4782]: I0130 20:12:08.480703 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:09 crc kubenswrapper[4782]: I0130 20:12:09.129022 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kcdcn"] Jan 30 20:12:09 crc kubenswrapper[4782]: I0130 20:12:09.704115 4782 generic.go:334] "Generic (PLEG): container finished" podID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerID="060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6" exitCode=0 Jan 30 20:12:09 crc kubenswrapper[4782]: I0130 20:12:09.704504 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcdcn" event={"ID":"a8e7bbca-5fbc-45ef-938a-07edff6d58f3","Type":"ContainerDied","Data":"060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6"} Jan 30 20:12:09 crc kubenswrapper[4782]: I0130 20:12:09.704533 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcdcn" event={"ID":"a8e7bbca-5fbc-45ef-938a-07edff6d58f3","Type":"ContainerStarted","Data":"680578218e3274dc151c790bcf868b4393c7c03a0d61ee2ba2bbac5b43d9b2ab"} Jan 30 20:12:11 crc kubenswrapper[4782]: E0130 20:12:11.199482 4782 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.212:45274->38.102.83.212:36463: write tcp 38.102.83.212:45274->38.102.83.212:36463: write: broken pipe Jan 30 20:12:11 crc kubenswrapper[4782]: I0130 20:12:11.724937 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcdcn" event={"ID":"a8e7bbca-5fbc-45ef-938a-07edff6d58f3","Type":"ContainerStarted","Data":"05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb"} Jan 30 20:12:12 crc kubenswrapper[4782]: I0130 20:12:12.411164 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:12:12 crc kubenswrapper[4782]: E0130 20:12:12.411997 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:12:16 crc kubenswrapper[4782]: I0130 20:12:16.776878 4782 generic.go:334] "Generic (PLEG): container finished" podID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerID="05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb" exitCode=0 Jan 30 20:12:16 crc kubenswrapper[4782]: I0130 20:12:16.776989 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcdcn" event={"ID":"a8e7bbca-5fbc-45ef-938a-07edff6d58f3","Type":"ContainerDied","Data":"05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb"} Jan 30 20:12:17 crc kubenswrapper[4782]: I0130 20:12:17.787759 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcdcn" event={"ID":"a8e7bbca-5fbc-45ef-938a-07edff6d58f3","Type":"ContainerStarted","Data":"f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5"} Jan 30 20:12:17 crc kubenswrapper[4782]: I0130 20:12:17.813593 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kcdcn" podStartSLOduration=2.318695814 podStartE2EDuration="9.81357255s" podCreationTimestamp="2026-01-30 20:12:08 +0000 UTC" firstStartedPulling="2026-01-30 20:12:09.706328414 +0000 UTC m=+6105.974706439" lastFinishedPulling="2026-01-30 20:12:17.20120515 +0000 UTC m=+6113.469583175" observedRunningTime="2026-01-30 20:12:17.812806121 +0000 UTC m=+6114.081184156" watchObservedRunningTime="2026-01-30 20:12:17.81357255 +0000 UTC m=+6114.081950575" Jan 30 20:12:18 crc kubenswrapper[4782]: I0130 20:12:18.481546 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:18 crc kubenswrapper[4782]: I0130 20:12:18.481800 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:19 crc kubenswrapper[4782]: I0130 20:12:19.531687 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kcdcn" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="registry-server" probeResult="failure" output=< Jan 30 20:12:19 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 20:12:19 crc kubenswrapper[4782]: > Jan 30 20:12:25 crc kubenswrapper[4782]: I0130 20:12:25.412284 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:12:25 crc kubenswrapper[4782]: E0130 20:12:25.413101 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:12:29 crc kubenswrapper[4782]: I0130 20:12:29.554189 4782 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kcdcn" 
podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="registry-server" probeResult="failure" output=< Jan 30 20:12:29 crc kubenswrapper[4782]: timeout: failed to connect service ":50051" within 1s Jan 30 20:12:29 crc kubenswrapper[4782]: > Jan 30 20:12:38 crc kubenswrapper[4782]: I0130 20:12:38.542342 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:38 crc kubenswrapper[4782]: I0130 20:12:38.628764 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:39 crc kubenswrapper[4782]: I0130 20:12:39.355064 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcdcn"] Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.008498 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kcdcn" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="registry-server" containerID="cri-o://f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5" gracePeriod=2 Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.412569 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:12:40 crc kubenswrapper[4782]: E0130 20:12:40.413378 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.510857 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.556015 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfrc6\" (UniqueName: \"kubernetes.io/projected/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-kube-api-access-tfrc6\") pod \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.556152 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-utilities\") pod \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.556306 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-catalog-content\") pod \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\" (UID: \"a8e7bbca-5fbc-45ef-938a-07edff6d58f3\") " Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.558038 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-utilities" (OuterVolumeSpecName: "utilities") pod "a8e7bbca-5fbc-45ef-938a-07edff6d58f3" (UID: "a8e7bbca-5fbc-45ef-938a-07edff6d58f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.565527 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-kube-api-access-tfrc6" (OuterVolumeSpecName: "kube-api-access-tfrc6") pod "a8e7bbca-5fbc-45ef-938a-07edff6d58f3" (UID: "a8e7bbca-5fbc-45ef-938a-07edff6d58f3"). InnerVolumeSpecName "kube-api-access-tfrc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.660886 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfrc6\" (UniqueName: \"kubernetes.io/projected/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-kube-api-access-tfrc6\") on node \"crc\" DevicePath \"\"" Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.662607 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.720413 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8e7bbca-5fbc-45ef-938a-07edff6d58f3" (UID: "a8e7bbca-5fbc-45ef-938a-07edff6d58f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:12:40 crc kubenswrapper[4782]: I0130 20:12:40.764987 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8e7bbca-5fbc-45ef-938a-07edff6d58f3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.038916 4782 generic.go:334] "Generic (PLEG): container finished" podID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerID="f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5" exitCode=0 Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.038987 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcdcn" event={"ID":"a8e7bbca-5fbc-45ef-938a-07edff6d58f3","Type":"ContainerDied","Data":"f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5"} Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.039021 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kcdcn" event={"ID":"a8e7bbca-5fbc-45ef-938a-07edff6d58f3","Type":"ContainerDied","Data":"680578218e3274dc151c790bcf868b4393c7c03a0d61ee2ba2bbac5b43d9b2ab"} Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.039042 4782 scope.go:117] "RemoveContainer" containerID="f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.039196 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kcdcn" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.090732 4782 scope.go:117] "RemoveContainer" containerID="05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.098301 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kcdcn"] Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.118036 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kcdcn"] Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.124372 4782 scope.go:117] "RemoveContainer" containerID="060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.169552 4782 scope.go:117] "RemoveContainer" containerID="f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5" Jan 30 20:12:41 crc kubenswrapper[4782]: E0130 20:12:41.170131 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5\": container with ID starting with f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5 not found: ID does not exist" containerID="f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.170192 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5"} err="failed to get container status \"f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5\": rpc error: code = NotFound desc = could not find container \"f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5\": container with ID starting with f8e41abbb1f1712fe6da0abbb6836ddba96429799add6249d75cde7c6911d5e5 not found: ID does not exist" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.170267 4782 scope.go:117] "RemoveContainer" containerID="05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb" Jan 30 20:12:41 crc kubenswrapper[4782]: E0130 20:12:41.170715 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb\": container with ID starting with 05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb not found: ID does not exist" containerID="05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.170765 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb"} err="failed to get container status \"05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb\": rpc error: code = NotFound desc = could not find container \"05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb\": container with ID starting with 05f257e26dbbdffef390414ceb09c89d5bd9e6f31ccc30d3ea7ef8b4e8cafabb not found: ID does not exist" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.170802 4782 scope.go:117] "RemoveContainer" containerID="060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6" Jan 30 20:12:41 crc kubenswrapper[4782]: E0130 20:12:41.171123 4782 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6\": container with ID starting with 060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6 not found: ID does not exist" containerID="060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6" Jan 30 20:12:41 crc kubenswrapper[4782]: I0130 20:12:41.171150 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6"} err="failed to get container status \"060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6\": rpc error: code = NotFound desc = could not find container \"060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6\": container with ID starting with 060457e9da1caf8181270f5344ce9c6603ffe3dfce0322e7eafd1686e2d9e9e6 not found: ID does not exist" Jan 30 20:12:42 crc kubenswrapper[4782]: I0130 20:12:42.433288 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" path="/var/lib/kubelet/pods/a8e7bbca-5fbc-45ef-938a-07edff6d58f3/volumes" Jan 30 20:12:53 crc kubenswrapper[4782]: I0130 20:12:53.411090 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:12:53 crc kubenswrapper[4782]: E0130 20:12:53.411984 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:13:07 crc kubenswrapper[4782]: I0130 20:13:07.410841 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:13:07 crc kubenswrapper[4782]: E0130 20:13:07.411934 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:13:22 crc kubenswrapper[4782]: I0130 20:13:22.411725 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:13:22 crc kubenswrapper[4782]: E0130 20:13:22.414325 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:13:33 crc kubenswrapper[4782]: I0130 20:13:33.412090 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:13:33 crc kubenswrapper[4782]: E0130 20:13:33.412867 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:13:45 crc kubenswrapper[4782]: I0130 20:13:45.411113 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:13:45 crc kubenswrapper[4782]: E0130 20:13:45.411807 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:13:59 crc kubenswrapper[4782]: I0130 20:13:59.411747 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:13:59 crc kubenswrapper[4782]: E0130 20:13:59.412823 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:14:07 crc kubenswrapper[4782]: I0130 20:14:07.065397 4782 generic.go:334] "Generic (PLEG): container finished" podID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerID="86f473b437bf96416d0ab246050d7b1fe8044ecc55ee521f0908d36a6d275722" exitCode=0 Jan 30 20:14:07 crc kubenswrapper[4782]: I0130 20:14:07.065475 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" event={"ID":"f498c4c6-fa9e-4e2c-a41d-217b45043e07","Type":"ContainerDied","Data":"86f473b437bf96416d0ab246050d7b1fe8044ecc55ee521f0908d36a6d275722"} Jan 30 20:14:07 crc kubenswrapper[4782]: I0130 20:14:07.066196 4782 scope.go:117] "RemoveContainer" containerID="86f473b437bf96416d0ab246050d7b1fe8044ecc55ee521f0908d36a6d275722" Jan 30 20:14:07 crc kubenswrapper[4782]: I0130 20:14:07.692442 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w5bgg_must-gather-2nzsh_f498c4c6-fa9e-4e2c-a41d-217b45043e07/gather/0.log" Jan 30 20:14:10 crc kubenswrapper[4782]: I0130 20:14:10.411521 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:14:10 crc kubenswrapper[4782]: E0130 20:14:10.412814 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:14:18 crc kubenswrapper[4782]: I0130 20:14:18.904309 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-w5bgg/must-gather-2nzsh"] Jan 30 20:14:18 crc kubenswrapper[4782]: I0130 
20:14:18.905354 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerName="copy" containerID="cri-o://8583b5ca2cc0a5e1a2db33736d987066b50218486f716cfb75d0aa895c94e5bd" gracePeriod=2 Jan 30 20:14:18 crc kubenswrapper[4782]: I0130 20:14:18.916735 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-w5bgg/must-gather-2nzsh"] Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.217911 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w5bgg_must-gather-2nzsh_f498c4c6-fa9e-4e2c-a41d-217b45043e07/copy/0.log" Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.218255 4782 generic.go:334] "Generic (PLEG): container finished" podID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerID="8583b5ca2cc0a5e1a2db33736d987066b50218486f716cfb75d0aa895c94e5bd" exitCode=143 Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.367572 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w5bgg_must-gather-2nzsh_f498c4c6-fa9e-4e2c-a41d-217b45043e07/copy/0.log" Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.368011 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.489442 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2znf2\" (UniqueName: \"kubernetes.io/projected/f498c4c6-fa9e-4e2c-a41d-217b45043e07-kube-api-access-2znf2\") pod \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\" (UID: \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.489836 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f498c4c6-fa9e-4e2c-a41d-217b45043e07-must-gather-output\") pod \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\" (UID: \"f498c4c6-fa9e-4e2c-a41d-217b45043e07\") " Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.501463 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f498c4c6-fa9e-4e2c-a41d-217b45043e07-kube-api-access-2znf2" (OuterVolumeSpecName: "kube-api-access-2znf2") pod "f498c4c6-fa9e-4e2c-a41d-217b45043e07" (UID: "f498c4c6-fa9e-4e2c-a41d-217b45043e07"). InnerVolumeSpecName "kube-api-access-2znf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.592419 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2znf2\" (UniqueName: \"kubernetes.io/projected/f498c4c6-fa9e-4e2c-a41d-217b45043e07-kube-api-access-2znf2\") on node \"crc\" DevicePath \"\"" Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.684188 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f498c4c6-fa9e-4e2c-a41d-217b45043e07-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f498c4c6-fa9e-4e2c-a41d-217b45043e07" (UID: "f498c4c6-fa9e-4e2c-a41d-217b45043e07"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:14:19 crc kubenswrapper[4782]: I0130 20:14:19.695492 4782 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f498c4c6-fa9e-4e2c-a41d-217b45043e07-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 30 20:14:20 crc kubenswrapper[4782]: I0130 20:14:20.231258 4782 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-w5bgg_must-gather-2nzsh_f498c4c6-fa9e-4e2c-a41d-217b45043e07/copy/0.log" Jan 30 20:14:20 crc kubenswrapper[4782]: I0130 20:14:20.231933 4782 scope.go:117] "RemoveContainer" containerID="8583b5ca2cc0a5e1a2db33736d987066b50218486f716cfb75d0aa895c94e5bd" Jan 30 20:14:20 crc kubenswrapper[4782]: I0130 20:14:20.232119 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w5bgg/must-gather-2nzsh" Jan 30 20:14:20 crc kubenswrapper[4782]: I0130 20:14:20.258429 4782 scope.go:117] "RemoveContainer" containerID="86f473b437bf96416d0ab246050d7b1fe8044ecc55ee521f0908d36a6d275722" Jan 30 20:14:20 crc kubenswrapper[4782]: I0130 20:14:20.421250 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" path="/var/lib/kubelet/pods/f498c4c6-fa9e-4e2c-a41d-217b45043e07/volumes" Jan 30 20:14:25 crc kubenswrapper[4782]: I0130 20:14:25.412991 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:14:25 crc kubenswrapper[4782]: E0130 20:14:25.414304 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:14:37 crc kubenswrapper[4782]: I0130 20:14:37.410835 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:14:37 crc kubenswrapper[4782]: E0130 20:14:37.411761 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:14:37 crc kubenswrapper[4782]: I0130 20:14:37.490016 4782 scope.go:117] "RemoveContainer" containerID="326d471868d4b9fd74604eba71257fc90157f87c7ce5b10f6736b49bea4085a3" Jan 30 20:14:37 crc kubenswrapper[4782]: I0130 20:14:37.526117 4782 scope.go:117] "RemoveContainer" containerID="6172c80be7d78138de838feea6af5a3710f62b0f558b53ea22defa907249b1a9" Jan 30 20:14:51 crc kubenswrapper[4782]: I0130 20:14:51.411035 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:14:51 crc kubenswrapper[4782]: E0130 20:14:51.411855 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.167896 4782 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl"] Jan 30 20:15:00 crc kubenswrapper[4782]: E0130 20:15:00.169160 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="extract-utilities" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169184 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="extract-utilities" Jan 30 20:15:00 crc kubenswrapper[4782]: E0130 20:15:00.169260 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerName="gather" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169274 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerName="gather" Jan 30 20:15:00 crc kubenswrapper[4782]: E0130 20:15:00.169312 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerName="copy" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169325 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerName="copy" Jan 30 20:15:00 crc kubenswrapper[4782]: E0130 20:15:00.169347 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="extract-content" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169359 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="extract-content" Jan 30 20:15:00 crc kubenswrapper[4782]: E0130 20:15:00.169392 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="registry-server" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169403 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="registry-server" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169732 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerName="gather" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169775 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="f498c4c6-fa9e-4e2c-a41d-217b45043e07" containerName="copy" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.169795 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e7bbca-5fbc-45ef-938a-07edff6d58f3" containerName="registry-server" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.170965 4782 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.176328 4782 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.176568 4782 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.199069 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl"] Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.209306 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-config-volume\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.209743 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-secret-volume\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.209945 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnplw\" (UniqueName: \"kubernetes.io/projected/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-kube-api-access-wnplw\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.312445 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-config-volume\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.312571 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-secret-volume\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.312599 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnplw\" (UniqueName: \"kubernetes.io/projected/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-kube-api-access-wnplw\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.313614 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-config-volume\") pod 
\"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.329926 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-secret-volume\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.332822 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnplw\" (UniqueName: \"kubernetes.io/projected/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-kube-api-access-wnplw\") pod \"collect-profiles-29496735-72xhl\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:00 crc kubenswrapper[4782]: I0130 20:15:00.512723 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:01 crc kubenswrapper[4782]: I0130 20:15:01.028195 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl"] Jan 30 20:15:01 crc kubenswrapper[4782]: I0130 20:15:01.705681 4782 generic.go:334] "Generic (PLEG): container finished" podID="2ce73ea1-6b5c-4d82-bc20-26ad88705dfc" containerID="393c8bf09f9110f2a717a25c0db92bab543533d273f3fed2a8df5a0ddf230999" exitCode=0 Jan 30 20:15:01 crc kubenswrapper[4782]: I0130 20:15:01.705912 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" event={"ID":"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc","Type":"ContainerDied","Data":"393c8bf09f9110f2a717a25c0db92bab543533d273f3fed2a8df5a0ddf230999"} Jan 30 20:15:01 crc kubenswrapper[4782]: I0130 20:15:01.705984 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" event={"ID":"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc","Type":"ContainerStarted","Data":"4283db99a10e9de1ea2b184ee9948002e1fb80d2d3babc941bf6ba3df56cd26d"} Jan 30 20:15:02 crc kubenswrapper[4782]: I0130 20:15:02.411385 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:15:02 crc kubenswrapper[4782]: E0130 20:15:02.412057 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.166223 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.276960 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-secret-volume\") pod \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.277335 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-config-volume\") pod \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.277470 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnplw\" (UniqueName: \"kubernetes.io/projected/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-kube-api-access-wnplw\") pod \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\" (UID: \"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc\") " Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.278085 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-config-volume" (OuterVolumeSpecName: "config-volume") pod "2ce73ea1-6b5c-4d82-bc20-26ad88705dfc" (UID: "2ce73ea1-6b5c-4d82-bc20-26ad88705dfc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.278831 4782 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.285503 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2ce73ea1-6b5c-4d82-bc20-26ad88705dfc" (UID: "2ce73ea1-6b5c-4d82-bc20-26ad88705dfc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.285564 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-kube-api-access-wnplw" (OuterVolumeSpecName: "kube-api-access-wnplw") pod "2ce73ea1-6b5c-4d82-bc20-26ad88705dfc" (UID: "2ce73ea1-6b5c-4d82-bc20-26ad88705dfc"). InnerVolumeSpecName "kube-api-access-wnplw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.381168 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnplw\" (UniqueName: \"kubernetes.io/projected/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-kube-api-access-wnplw\") on node \"crc\" DevicePath \"\"" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.381239 4782 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ce73ea1-6b5c-4d82-bc20-26ad88705dfc-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.729053 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" event={"ID":"2ce73ea1-6b5c-4d82-bc20-26ad88705dfc","Type":"ContainerDied","Data":"4283db99a10e9de1ea2b184ee9948002e1fb80d2d3babc941bf6ba3df56cd26d"} Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.729097 4782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4283db99a10e9de1ea2b184ee9948002e1fb80d2d3babc941bf6ba3df56cd26d" Jan 30 20:15:03 crc kubenswrapper[4782]: I0130 20:15:03.729155 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496735-72xhl" Jan 30 20:15:04 crc kubenswrapper[4782]: I0130 20:15:04.249065 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5"] Jan 30 20:15:04 crc kubenswrapper[4782]: I0130 20:15:04.259451 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496690-7cpl5"] Jan 30 20:15:04 crc kubenswrapper[4782]: I0130 20:15:04.435143 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b24fd70-c832-4124-b936-e73e54e41b38" path="/var/lib/kubelet/pods/0b24fd70-c832-4124-b936-e73e54e41b38/volumes" Jan 30 20:15:15 crc kubenswrapper[4782]: I0130 20:15:15.411540 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:15:15 crc kubenswrapper[4782]: E0130 20:15:15.412612 4782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7zdh_openshift-machine-config-operator(5eeb02b9-cc00-423a-87f6-2c326af45ceb)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" podUID="5eeb02b9-cc00-423a-87f6-2c326af45ceb" Jan 30 20:15:29 crc kubenswrapper[4782]: I0130 20:15:29.411797 4782 scope.go:117] "RemoveContainer" containerID="fe8ac3b148f66db8327f5c7a67c68ce2d43dbfd79ee373437658f5e6f2d75b04" Jan 30 20:15:30 crc kubenswrapper[4782]: I0130 20:15:30.026992 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7zdh" event={"ID":"5eeb02b9-cc00-423a-87f6-2c326af45ceb","Type":"ContainerStarted","Data":"745283571e6d33849d0a04b04d71e1e2b2b6174d4d67bcc718f23dc90fc44ea9"} Jan 30 20:15:37 crc kubenswrapper[4782]: I0130 20:15:37.686081 4782 scope.go:117] "RemoveContainer" containerID="ceb68721229ea238cf40a1500fff86d172f7bf2a6c5ee94080fbca1caf3d1f11" Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.821449 4782 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-46vjf"] Jan 30 20:15:47 crc kubenswrapper[4782]: E0130 20:15:47.823449 4782 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce73ea1-6b5c-4d82-bc20-26ad88705dfc" containerName="collect-profiles" Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.823485 4782 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce73ea1-6b5c-4d82-bc20-26ad88705dfc" containerName="collect-profiles" Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.823969 4782 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce73ea1-6b5c-4d82-bc20-26ad88705dfc" containerName="collect-profiles" Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.827700 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.838141 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-46vjf"] Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.999780 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9ldp\" (UniqueName: \"kubernetes.io/projected/b5a30d42-81e9-4b55-922c-beb500f9b4ee-kube-api-access-n9ldp\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.999862 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-utilities\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:47 crc kubenswrapper[4782]: I0130 20:15:47.999919 4782 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-catalog-content\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.102097 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9ldp\" (UniqueName: \"kubernetes.io/projected/b5a30d42-81e9-4b55-922c-beb500f9b4ee-kube-api-access-n9ldp\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.102171 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-utilities\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.102211 4782 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-catalog-content\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.103014 4782 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-catalog-content\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.103159 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-utilities\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.131334 4782 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9ldp\" (UniqueName: \"kubernetes.io/projected/b5a30d42-81e9-4b55-922c-beb500f9b4ee-kube-api-access-n9ldp\") pod \"community-operators-46vjf\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.180026 4782 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:48 crc kubenswrapper[4782]: W0130 20:15:48.724429 4782 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a30d42_81e9_4b55_922c_beb500f9b4ee.slice/crio-74cc6d50cbf518bfbb331e71332fbb046c119b84b757df88c09dfa14f069ecf9 WatchSource:0}: Error finding container 74cc6d50cbf518bfbb331e71332fbb046c119b84b757df88c09dfa14f069ecf9: Status 404 returned error can't find the container with id 74cc6d50cbf518bfbb331e71332fbb046c119b84b757df88c09dfa14f069ecf9 Jan 30 20:15:48 crc kubenswrapper[4782]: I0130 20:15:48.727594 4782 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-46vjf"] Jan 30 20:15:49 crc kubenswrapper[4782]: I0130 20:15:49.280448 4782 generic.go:334] "Generic (PLEG): container finished" podID="b5a30d42-81e9-4b55-922c-beb500f9b4ee" containerID="e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7" exitCode=0 Jan 30 20:15:49 crc kubenswrapper[4782]: I0130 20:15:49.280516 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46vjf" event={"ID":"b5a30d42-81e9-4b55-922c-beb500f9b4ee","Type":"ContainerDied","Data":"e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7"} Jan 30 20:15:49 crc kubenswrapper[4782]: I0130 20:15:49.281200 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46vjf" event={"ID":"b5a30d42-81e9-4b55-922c-beb500f9b4ee","Type":"ContainerStarted","Data":"74cc6d50cbf518bfbb331e71332fbb046c119b84b757df88c09dfa14f069ecf9"} Jan 30 20:15:51 crc kubenswrapper[4782]: I0130 20:15:51.308033 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46vjf" event={"ID":"b5a30d42-81e9-4b55-922c-beb500f9b4ee","Type":"ContainerStarted","Data":"1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942"} Jan 30 20:15:52 crc kubenswrapper[4782]: I0130 20:15:52.319737 4782 generic.go:334] "Generic (PLEG): container finished" podID="b5a30d42-81e9-4b55-922c-beb500f9b4ee" containerID="1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942" exitCode=0 Jan 30 20:15:52 crc kubenswrapper[4782]: I0130 20:15:52.319774 4782 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46vjf" event={"ID":"b5a30d42-81e9-4b55-922c-beb500f9b4ee","Type":"ContainerDied","Data":"1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942"} Jan 30 20:15:53 crc kubenswrapper[4782]: I0130 20:15:53.333633 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46vjf" event={"ID":"b5a30d42-81e9-4b55-922c-beb500f9b4ee","Type":"ContainerStarted","Data":"34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0"} Jan 30 20:15:53 crc kubenswrapper[4782]: I0130 20:15:53.363207 4782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-46vjf" podStartSLOduration=2.9185138630000003 podStartE2EDuration="6.363184602s" podCreationTimestamp="2026-01-30 20:15:47 +0000 UTC" firstStartedPulling="2026-01-30 20:15:49.286777184 +0000 UTC m=+6325.555155229" lastFinishedPulling="2026-01-30 20:15:52.731447903 +0000 UTC m=+6328.999825968" observedRunningTime="2026-01-30 20:15:53.351830362 +0000 UTC m=+6329.620208407" watchObservedRunningTime="2026-01-30 20:15:53.363184602 +0000 UTC m=+6329.631562637" Jan 30 20:15:58 crc kubenswrapper[4782]: I0130 20:15:58.180393 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:58 crc kubenswrapper[4782]: I0130 20:15:58.180819 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:58 crc kubenswrapper[4782]: I0130 20:15:58.274479 4782 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:58 crc kubenswrapper[4782]: I0130 20:15:58.476636 4782 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:15:58 crc kubenswrapper[4782]: I0130 20:15:58.531885 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-46vjf"] Jan 30 20:16:00 crc kubenswrapper[4782]: I0130 20:16:00.428902 4782 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-46vjf" podUID="b5a30d42-81e9-4b55-922c-beb500f9b4ee" containerName="registry-server" containerID="cri-o://34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0" gracePeriod=2 Jan 30 20:16:00 crc kubenswrapper[4782]: I0130 20:16:00.930468 4782 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.012643 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9ldp\" (UniqueName: \"kubernetes.io/projected/b5a30d42-81e9-4b55-922c-beb500f9b4ee-kube-api-access-n9ldp\") pod \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.013199 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-catalog-content\") pod \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.013327 4782 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-utilities\") pod \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\" (UID: \"b5a30d42-81e9-4b55-922c-beb500f9b4ee\") " Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.014731 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-utilities" (OuterVolumeSpecName: "utilities") pod "b5a30d42-81e9-4b55-922c-beb500f9b4ee" (UID: "b5a30d42-81e9-4b55-922c-beb500f9b4ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.027586 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a30d42-81e9-4b55-922c-beb500f9b4ee-kube-api-access-n9ldp" (OuterVolumeSpecName: "kube-api-access-n9ldp") pod "b5a30d42-81e9-4b55-922c-beb500f9b4ee" (UID: "b5a30d42-81e9-4b55-922c-beb500f9b4ee"). InnerVolumeSpecName "kube-api-access-n9ldp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.115408 4782 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.115439 4782 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9ldp\" (UniqueName: \"kubernetes.io/projected/b5a30d42-81e9-4b55-922c-beb500f9b4ee-kube-api-access-n9ldp\") on node \"crc\" DevicePath \"\"" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.458796 4782 generic.go:334] "Generic (PLEG): container finished" podID="b5a30d42-81e9-4b55-922c-beb500f9b4ee" containerID="34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0" exitCode=0 Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.458843 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46vjf" event={"ID":"b5a30d42-81e9-4b55-922c-beb500f9b4ee","Type":"ContainerDied","Data":"34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0"} Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.458873 4782 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46vjf" event={"ID":"b5a30d42-81e9-4b55-922c-beb500f9b4ee","Type":"ContainerDied","Data":"74cc6d50cbf518bfbb331e71332fbb046c119b84b757df88c09dfa14f069ecf9"} Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.458874 4782 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-46vjf" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.458949 4782 scope.go:117] "RemoveContainer" containerID="34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.483248 4782 scope.go:117] "RemoveContainer" containerID="1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.484532 4782 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5a30d42-81e9-4b55-922c-beb500f9b4ee" (UID: "b5a30d42-81e9-4b55-922c-beb500f9b4ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.508017 4782 scope.go:117] "RemoveContainer" containerID="e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.547966 4782 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a30d42-81e9-4b55-922c-beb500f9b4ee-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.575318 4782 scope.go:117] "RemoveContainer" containerID="34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0" Jan 30 20:16:01 crc kubenswrapper[4782]: E0130 20:16:01.575803 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0\": container with ID starting with 34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0 not found: ID does not exist" containerID="34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.575866 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0"} err="failed to get container status \"34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0\": rpc error: code = NotFound desc = could not find container \"34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0\": container with ID starting with 34a1a7a30e4411cb3a3899138917e0e6c270791efdff9ae9c171a9a02a06cdf0 not found: ID does not exist" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.575899 4782 scope.go:117] "RemoveContainer" containerID="1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942" Jan 30 20:16:01 crc kubenswrapper[4782]: E0130 20:16:01.576552 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942\": container with ID starting with 1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942 not found: ID does not exist" containerID="1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.576599 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942"} err="failed to get container status \"1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942\": rpc error: code = NotFound desc = could not find container \"1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942\": container with ID starting with 1667c157a63ce3679e43b2edbdb950a449fb3248e2861391f5cd19d12ad38942 not found: ID does not exist" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.576629 4782 scope.go:117] "RemoveContainer" containerID="e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7" Jan 30 20:16:01 crc kubenswrapper[4782]: E0130 20:16:01.576906 4782 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7\": container with ID starting with e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7 not found: ID does not exist" 
containerID="e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.576941 4782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7"} err="failed to get container status \"e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7\": rpc error: code = NotFound desc = could not find container \"e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7\": container with ID starting with e843cd3072c0060f3cd1cbbc2ee5c32cfdef7d5695137f755ed4c2a805707bd7 not found: ID does not exist" Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.799001 4782 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-46vjf"] Jan 30 20:16:01 crc kubenswrapper[4782]: I0130 20:16:01.810011 4782 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-46vjf"] Jan 30 20:16:02 crc kubenswrapper[4782]: I0130 20:16:02.432821 4782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a30d42-81e9-4b55-922c-beb500f9b4ee" path="/var/lib/kubelet/pods/b5a30d42-81e9-4b55-922c-beb500f9b4ee/volumes"
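The excerpt above is raw kubenswrapper output: repeated CrashLoopBackOff back-off entries for machine-config-daemon-p7zdh, the must-gather pod's gather/copy containers finishing and being torn down, a collect-profiles job run, and the community-operators catalog pod's create/probe/delete cycle. When condensing a capture like this, a short script over the journal text is usually enough. The sketch below is illustrative only, not part of kubelet or any OpenShift tooling: it assumes the journal is piped in on stdin (for example from `journalctl -u kubelet`, one entry per line) and uses hand-written regular expressions matched against the klog-style fields visible in these entries.

```python
#!/usr/bin/env python3
"""Minimal sketch: summarize kubenswrapper journal lines like the ones above.

Assumptions (not part of kubelet or any official tooling):
- journal text arrives on stdin, one entry per line,
  e.g. `journalctl -u kubelet | python3 summarize.py`;
- entries use the klog-style fields seen in this log
  (pod="ns/name", podID="...", containerID="...", exitCode=N).
"""
import re
import sys
from collections import Counter

# "Error syncing pod" back-off entries carry the pod in a trailing pod="ns/name" field.
BACKOFF_RE = re.compile(r'CrashLoopBackOff.*?pod="([^"]+)"')
# PLEG "container finished" events carry the pod UID, container ID and exit code.
FINISHED_RE = re.compile(
    r'container finished" podID="([^"]+)" containerID="([^"]+)" exitCode=(-?\d+)'
)

backoffs = Counter()
finished = []

for line in sys.stdin:
    m = BACKOFF_RE.search(line)
    if m:
        backoffs[m.group(1)] += 1
        continue
    m = FINISHED_RE.search(line)
    if m:
        pod_uid, cid, code = m.groups()
        finished.append((pod_uid, cid[:12], int(code)))

print("CrashLoopBackOff sync errors per pod:")
for pod, count in backoffs.most_common():
    print(f"  {count:4d}  {pod}")

print("\nContainers reported finished (pod UID, container ID prefix, exit code):")
for pod_uid, cid, code in finished:
    print(f"  {pod_uid}  {cid}  exit={code}")
```

Against a capture like this one, such a summary makes the repeated back-off for machine-config-daemon-p7zdh stand out, and shows the gather container exiting 0 while the copy container exits 143 (SIGTERM from the graceful delete seen in the SyncLoop DELETE entries).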